Data is omnipresent in every organization. It comes in from customers, employees, third parties, and other external sources, and it is up to each company to find ways to handle this rapidly growing data and put it to good use. Smart businesses are already exploring how data can address issues inside and outside the organization, and how it can differentiate them from the competition.

Some challenges arise when it comes to leveraging this information. With the many technological advancements over the past two decades, the amount of information coming in is growing at an almost exponential rate. What's more, most of this data is unstructured.

Structured data is much easier to handle. Businesses use it every day in relational databases or Excel spreadsheets, to give a couple of examples. Because the data conforms to a defined schema, patterns emerge and can be identified easily.
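To make that concrete, here is a minimal sketch of why schema-bound data is easy to interrogate. The table, rows, and values are invented for illustration; it uses an in-memory SQLite database so nothing needs to be installed.

```python
import sqlite3

# A tiny in-memory relational table (schema and rows invented for illustration).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("Acme", "East", 120.0), ("Acme", "East", 80.0), ("Globex", "West", 200.0)],
)

# Because every row follows the same schema, a pattern such as
# "total sales per region" falls out of a one-line query.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('East', 200.0), ('West', 200.0)]
```

No equivalent one-liner exists for a folder of emails or videos, which is exactly the gap the rest of this piece is about.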

The biggest issue in this context, however, is unstructured data. It can come from numerous sources, such as social media, emails, documents, blogs, videos, and images, and it represents ample opportunities for businesses to grow and optimize their operations.

Unfortunately, unstructured data makes it much more difficult to gain any straightforward insight using conventional systems. Because so much of the data generated today is unstructured, it is vital for businesses to find ways to leverage it properly.

Cloud Migration

First things first. With the overwhelming amount of data coming in on a daily basis, storing it on-site can become quite costly. On the one hand, keeping data on-site can lead to over-provisioning and further unnecessary costs; on the other, it can take up a lot of on-site real estate.

But by migrating your applications and databases to the cloud, none of the problems mentioned above need be an issue. With public cloud vendors such as AWS and Microsoft Azure, you pay as you go, giving you a much higher degree of flexibility and scalability than otherwise. In addition, keep in mind that a cloud provider effectively becomes an extension of your IT team once you've made the transition. And let's not forget that storing your data in the cloud also means less real-estate expense.

Cognitive Computing

Cognitive computing (CC) refers to technology platforms that combine artificial intelligence (AI) and signal processing. These platforms draw on machine learning, natural language processing (NLP), reasoning, speech recognition, human-computer interaction, and dialog generation, among other technologies.

CC systems can analyze unstructured data, interpret it, and generate insights by weighing the evidence behind each possible decision. They can be adaptive, learning as the information changes. They can be interactive, communicating seamlessly with users as well as with other devices and cloud services. And they can be contextual, understanding, identifying, and extracting contextual elements from multiple sources and from different sensory inputs such as visual, auditory, and gestural data.
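As a tiny illustration of the first step in structuring unstructured text (nothing like a full NLP pipeline, just term counting), the sketch below pulls recurring themes out of free-form feedback. The feedback strings and stop-word list are invented for this example:

```python
import re
from collections import Counter

# Hypothetical unstructured customer feedback (invented for this sketch).
feedback = [
    "Shipping was slow and the package arrived damaged.",
    "Great product, but shipping took far too long.",
    "Customer support resolved my damaged item quickly.",
]

STOP_WORDS = {"was", "and", "the", "but", "my", "too", "far", "a"}

def term_frequencies(texts):
    """Tokenize free text and count non-stop-word terms."""
    tokens = []
    for text in texts:
        tokens += [t for t in re.findall(r"[a-z]+", text.lower())
                   if t not in STOP_WORDS]
    return Counter(tokens)

freqs = term_frequencies(feedback)
# "shipping" and "damaged" surface as recurring themes.
print(freqs.most_common(3))
```

Real cognitive platforms go far beyond counting, of course, but even this toy version shows how signal can be pulled from text that no relational schema describes.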

In short, cognitive computing will help businesses understand and structure disorderly data to put it to good use and get ahead of the competition.


Big data can offer plenty of opportunities for growth and profitability, but it can also pose a severe challenge if not leveraged correctly. For more information on the topic of data management and other related issues, visit our website or contact us directly.



New Azure services help more people realize the possibilities of big data

By T. K. “Ranga” Rengarajan
This week in San Jose thousands of people are at Strata + Hadoop World to explore the technology and business of big data. As part of our participation in the conference, we are pleased to announce new and enhanced Microsoft data services: a preview of Azure HDInsight running on Linux, the general availability of Storm on HDInsight, the general availability of Azure Machine Learning, and the availability of Informatica technology on Azure.
These new services are part of our continued investment in a broad portfolio of solutions to unlock insights from data. They can help businesses dramatically improve their performance, enable governments to better serve their citizenry, or accelerate new advancements in science. Our goal is to make big data technology simpler and more accessible to the greatest number of people possible: big data pros, data scientists and app developers, but also everyday businesspeople and IT managers. Azure is at the center of our strategy, offering customers scale, simplicity and great economics. And we’re embracing open technologies, so people can use the tools, languages and platforms of their choice to pull the maximum value from their data.
Simply put, we want to bring big data to the mainstream.
Azure HDInsight, our Apache Hadoop-based service in the cloud, is a prime example. It makes it easy for customers to crunch petabytes of all types of data with fast, cost-effective scale on demand, as well as programming extensions so developers can use their favorite languages. Customers like Virginia Tech, Chr. Hansen, Mediatonic and many others are using it to find important data insights. And, today, we are announcing that customers can run HDInsight on Ubuntu clusters (the leading scale-out Linux), in addition to Windows, with simple deployment, a managed service level agreement and full technical support. This is particularly compelling for people who already use Hadoop on Linux on-premises, such as on the Hortonworks Data Platform, because they can use common Linux tools, documentation, and templates and extend their deployment to Azure with hybrid cloud connections.

Storm for Azure HDInsight, generally available today, is another example of making big data simpler and more accessible. Storm is an open source stream analytics platform that can process millions of data “events” in real time as they are generated by sensors and devices. Using Storm with HDInsight, customers can deploy and manage applications for real-time analytics and Internet-of-Things scenarios in a few minutes with just a few clicks. Linkury is using HDInsight with Storm for its online monetization services, for example. We are also making Storm available for both .NET and Java, along with the ability to develop, deploy, and debug real-time Storm applications directly in Visual Studio. That helps developers be productive in the environments they know best.
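To make the stream-processing idea concrete, here is a plain-Python analogy (this is not Storm's actual API, and the sensor events are invented): a generator stands in for an unbounded event stream, and per-device counts are updated incrementally after each event rather than in one batch at the end, which is the essential difference between stream and batch analytics.

```python
from collections import defaultdict

# Invented sensor events; in a real system these would arrive continuously.
events = [
    {"device": "sensor-1", "reading": 21.5},
    {"device": "sensor-2", "reading": 19.0},
    {"device": "sensor-1", "reading": 22.1},
]

def event_stream(source):
    """Simulate an unbounded stream by yielding events one at a time."""
    for event in source:
        yield event

def running_counts(stream):
    """Update per-device counts incrementally, as a stream processor would."""
    counts = defaultdict(int)
    for event in stream:
        counts[event["device"]] += 1
        yield dict(counts)  # snapshot of state after each event

snapshots = list(running_counts(event_stream(events)))
print(snapshots[-1])  # {'sensor-1': 2, 'sensor-2': 1}
```

A Storm topology does the same thing at scale, with the stream partitioned across many machines and fault tolerance built in.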
You can read this blog to learn about these and other updates we’re making to HDInsight to make Hadoop simpler and easier to use on Azure.
Azure Machine Learning, also generally available today, further demonstrates our commitment to help more people and organizations use the cloud to unlock the possibilities of data. It is a first-of-its-kind, managed cloud service for advanced analytics that makes it dramatically simpler for businesses to predict future trends with data. In mere hours, developers and data scientists can build and deploy apps to improve customer experiences, predict and prevent system failures, enhance operational efficiencies, uncover new technical insights, or realize a universe of other benefits. Such advanced analytics normally take weeks or months and require extensive investment in people, hardware and software to manage big data. Also, now developers – even those without data science training – can use the Machine Learning Marketplace to find APIs and finished services, such as recommendations, anomaly detection and forecasting, in order to deploy solutions quickly. Already customers like Pier 1, Carnegie Mellon, eSmart Systems, Mendeley and ThyssenKrupp are finding value in their data with Azure Machine Learning.

Azure Machine Learning reflects our support for open source. The Python programming language is a first class citizen in Azure Machine Learning Studio, along with R, the popular language of statisticians. New breakthrough algorithms, such as “Learning with Counts,” now allow customers to learn from terabytes of data. A new community gallery allows data scientists to share experiments via Twitter and LinkedIn, too. You can read more about these innovations and how customers are using Azure Machine Learning in this blog post.
Another key part of our strategy is to offer customers a wide range of partner solutions that build on and extend the benefits of Azure data services. Today, data integration leader Informatica is joining the growing ecosystem of partners in the Azure Marketplace. The Informatica Cloud agent is now available in Linux and Windows virtual machines on Azure. That will enable enterprise customers to create data pipelines from both on-premises systems and the cloud to Azure data services such as Azure HDInsight, Azure Machine Learning, Azure Data Factory and others, for management and analysis.
The value provided by our data services multiplies when customers use them together. A case in point is Ziosk, maker of the world’s first ordering, entertainment and pay-at-the-table tablet. They are using Azure HDInsight, Azure Machine Learning, our Power BI analytics service and other Microsoft technologies to help restaurant chains like Chili’s drive guest satisfaction, frequency and advocacy with data from tabletop devices in 1,400 locations.
This week the big data world is focused on Strata + Hadoop World, a great event for the industry and community. It’s exciting to consider the new ideas and innovations happening around the world every day with data. Here at Microsoft, we’re thrilled to be part of it and to fuel that innovation with data solutions that give customers simple but powerful capabilities, using their choice of tools and platforms in the cloud.



Analytics 50: How big data innovators reap results

Five winners of the 2016 Drexel University Analytics 50 awards share details of their projects, lessons learned, and advice.

By Thor Olavsrud


Data and analytics are reshaping organizations and business processes, giving organizations the capability to interrogate internal and external data to better understand their customers and drive transformative efficiencies.
Worldwide revenues for big data and business analytics clocked in at nearly $122 billion in 2015 and will grow to $187 billion in 2019, according to a five-year forecast from research firm IDC.
“Organizations able to take advantage of the new generation of business analytics solutions can leverage digital transformation to adapt to disruptive changes and create competitive differentiation in their markets,” said IDC analyst Dan Vesset in a statement issued in conjunction with the release of IDC’s Worldwide Semiannual Big Data and Analytics Spending Guide earlier this year. “These organizations don’t just automate existing processes — they treat data as they would any valued asset by using a focused approach to extracting and developing the value of information.”
Additionally, a recent Forrester Research study, commissioned by the global data and analytics team at KPMG, found that 50 percent of businesses now use data and analytics tools to analyze their existing customers, while 48 percent use them to find new customers and 47 percent use them to develop new products and services.
The picture isn’t entirely rosy, however. That same Forrester study found that many organizations are struggling to adjust their cultures to a world in which data and analytics play a central role, and many business executives mistrust the insights generated by data and analytics.
Other organizations, however, have taken naturally to data and analytics and are using new tools to better understand customers, develop new products and optimize business processes.
To honor those organizations, Drexel University’s LeBow College of Business recently announced the first Analytics 50 awards. The winners represent a broad spectrum of industries, from pharmaceuticals and healthcare to sports and media.


Using data science to beat cancer

By Nancy Brinker
The complexity of seeking a cure for cancer has vexed researchers for decades. While they’ve made remarkable progress, they are still waging a battle uphill as cancer remains one of the leading causes of death worldwide.
Yet scientists may soon have a critical new ally at their side: intelligent machines that can attack that complexity in a different way.
Consider an example from the world of gaming: Last year, Google’s artificial intelligence platform, AlphaGo, deployed techniques in deep learning to beat South Korean grandmaster Lee Sedol in the immensely complex game of Go, which has more possible positions than there are stars in the universe.
Those same techniques of machine learning and AI can be brought to bear in the massive scientific puzzle of cancer.
One thing is certain — we won’t have a shot at conquering cancer with these new methods if we don’t have more data to work with. Many data sets, including medical records, genetic tests and mammograms, for example, are locked up and out of reach of our best scientific minds and our best learning algorithms.
The good news is that big data’s role in cancer research is now at center stage, and a number of large-scale, government-led sequencing initiatives are moving forward. Those include the U.S. Department of Veterans Affairs’ Million Veteran Program; the 100,000 Genomes Project in the U.K.; and the NIH’s The Cancer Genome Atlas, which holds data from more than 11,000 patients and is open to researchers everywhere to analyze via the cloud. According to a recent study, as many as 2 billion human genomes could be sequenced by 2025.
There are other trends driving demand for fresh data, including genetic testing. In 2007, sequencing one person’s genome cost $10 million. Today you can get this done for less than $1,000. In other words, for every person sequenced 10 years ago, we can now do 10,000. The implications are big: Discovering that you have a mutation linked to higher risk of certain types of cancer can sometimes be a life-saving bit of information. And as costs approach mass affordability, research efforts approach massive potential scale.
A central challenge for researchers (and society) is that current data sets lack both volume and ethnic diversity. In addition, researchers often face restrictive legal terms and reluctant sharing partnerships. Even when organizations share genomic data sets, the agreements are typically between individual institutions for individual data sets. While there are larger clearinghouses and databases operating today that have done great work, we need more work on standardized terms and platforms to accelerate access.
The potential benefits of these new technologies go beyond identifying risk and screening. Advances in machine learning can help accelerate cancer drug development and therapy selection, enable doctors to match patients with clinical trials, and improve their ability to provide custom treatment plans for cancer patients (Herceptin, one of the earliest examples, remains one of the best).
We believe three things need to happen to make data more available for use for cancer research and AI programs. First, patients should be able to contribute data easily. This includes medical records, radiology images and genetic testing. Laboratory companies and medical centers should adopt a common consent form to make it easy and legal for data sharing to occur. Second, more funding is needed for researchers working at the intersection of AI, data science and cancer. Just as the Chan Zuckerberg Foundation is funding new tool development for medicine, new AI techniques need to be funded for medical applications. Third, new data sets should be generated, focused on people of all ethnicities. We need to make sure that advances in cancer research are accessible to all.

Excel and big data

One of the great things about being on the Excel team is the opportunity to meet with a broad set of customers. In talking with Excel users, it’s obvious that significant confusion exists about what exactly is “big data.” Many customers are left on their own to make sense of a cacophony of acronyms, technologies, architectures, business models and vertical scenarios.
It is therefore unsurprising that some folks have come up with wildly different ways to define what “big data” means. We’ve heard from some folks who thought big data was working with two thousand rows of data. And we’ve heard from vendors who claim to have been doing big data for decades and don’t see it as something new. The wide range of interpretations sometimes reminds us of the old parable of the blind men and an elephant, where a group of men touch an elephant to learn what it is. Each man feels a different part, but only one part, such as the tail or the tusk. They then compare notes and learn that they are in complete disagreement.
Defining big data
On the Excel team, we’ve taken pointers from analysts to define big data as data that includes any of the following:
•High volume—Both in terms of data items and dimensionality.
•High velocity—Arriving at a very high rate, with usually an assumption of low latency between data arrival and deriving value.
•High variety—Embraces the ability for data shape and meaning to evolve over time.
And which requires:
•Cost-effective processing—As we mentioned, many of the vendors claim they’ve been doing big data for decades. Technically this is accurate; however, many of these solutions rely on expensive scale-up machines with custom hardware and SAN storage underneath to get enough horsepower. The most promising aspect of big data is the innovation that allows a choice to trade off some aspects of a solution to gain unprecedented lower cost of building and deploying solutions.
•Innovative types of analysis—Doing the same old analysis on more data is generally a good sign you’re doing scale-up and not big data.
•Novel business value—Between this principle and the previous one, if a data set doesn’t really change how you do analysis or what you do with your analytic result, then it’s likely not big data.
At the same time, savvy technologists also realize sometimes their needs are best met with tried and trusted technologies. When they need to build a mission critical system that requires ACID transactions, a robust query language and enterprise-grade security, relational databases usually fit the bill quite well, especially as relational vendors advance their offerings to bring some of the benefits of new technologies to their existing customers. This calls for a more mature understanding of the needs and technologies to create the best fit.
Excel’s role in big data
There are a variety of different technology demands for dealing with big data: storage and infrastructure, capture and processing of data, ad-hoc and exploratory analysis, pre-built vertical solutions, and operational analytics baked into custom applications.
The sweet spot for Excel in the big data scenario categories is exploratory/ad hoc analysis. Here, business analysts want to use their favorite analysis tool against new data stores to get unprecedented richness of insight. They expect the tools to go beyond embracing the “volume, velocity and variety” aspects of big data by also allowing them to ask new types of questions they weren’t able to ask earlier: including more predictive and prescriptive experiences and the ability to include more unstructured data (like social feeds) as first-class input into their analytic workflow.
Broadly speaking, there are three patterns of using Excel with external data, each with its own set of dependencies and use cases. These can be combined together in a single workbook to meet appropriate needs.
When working with big data, there are a number of technologies and techniques that can be applied to make these three patterns successful.
Import data into Excel
Many customers use a connection to bring external data into Excel as a refreshable snapshot. The advantage here is that it creates a self-contained document that can be used for working offline, but refreshed with new data when online. Since the data is contained in Excel, customers can also transform it to reflect their own personal context or analytics needs.
When importing big data into Excel, there are a few key challenges that need to be accounted for:
•Querying big data—Data sources designed for big data, such as SaaS, HDFS and large relational sources, can sometimes require specialized tools. Thankfully, Excel has a solution: Power Query, which is built into Excel 2016 and available separately as a download for earlier versions. Power Query provides several modern sets of connectors for Excel customers, including connectors for relational, HDFS, SaaS (Dynamics CRM, SalesForce), etc. We’re constantly adding to this list and welcome your feedback on new connectors we should provide out of the box at our UserVoice.
•Transforming data—Big data, like all data, is rarely perfectly clean. Power Query provides the ability to create a coherent, repeatable and auditable set of data transformation steps. By combining simple actions into a series of applied steps, you can create a reliably clean and transformed set of data to work with.


•Handling large data sources—Power Query is designed to only pull down the “head” of the data set to give you a live preview of the data that is fast and fluid, without requiring the entire set to be loaded into memory. Then you can work with the queries, filter down to just the subset of data you wish to work with, and import that.
•Handling semi-structured data—A frequent need we see, especially in big data cases, is reading data that’s not as cleanly structured as traditional relational database data. It may be spread out across several files in a folder or very hierarchical in nature. Power Query provides elegant ways of treating both of these cases. All files in a folder can be processed as a unit in Power Query so you can write powerful transforms that work on groups (even filtered groups!) of files in a folder. In addition, several data stores as well as SaaS offerings embrace the JSON data format as a way of dealing with complex, nested and hierarchical data. Power Query has built-in support for extracting structure out of JSON-formatted data, making it much easier to take advantage of this complex data within Excel.
•Handling large volumes of data in Excel—Since Excel 2013, the “Data Model” feature in Excel has provided support for larger volumes of data than the 1M row limit per worksheet. Data Model also embraces the Tables, Columns, Relationships representation as first-class objects, as well as delivering pre-built commonly used business scenarios like year-over-year growth or working with organizational hierarchies. For several customers, the headroom the Data Model provides is sufficient for dealing with their own large data volumes. In addition to the product documentation, several of our MVPs have provided great content on Power Pivot and the Data Model. Here are a couple of articles from Rob Collie and Chandoo.
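The applied-steps idea above, flattening a nested JSON record into rows through a small, repeatable sequence of transformations, can be sketched in plain Python. (Power Query itself uses the M language, not Python, and the JSON payload and field names here are invented for illustration.)

```python
import json

# Invented nested JSON, the kind a SaaS API might return.
payload = json.loads("""
{
  "company": "Contoso",
  "orders": [
    {"id": 1, "lines": [{"sku": "A", "qty": 2}, {"sku": "B", "qty": 1}]},
    {"id": 2, "lines": [{"sku": "A", "qty": 5}]}
  ]
}
""")

def flatten(doc):
    """Step 1: expand each order. Step 2: expand each order line.
    Each step is explicit, so the transform is repeatable and auditable."""
    rows = []
    for order in doc["orders"]:
        for line in order["lines"]:
            rows.append({
                "company": doc["company"],
                "order_id": order["id"],
                "sku": line["sku"],
                "qty": line["qty"],
            })
    return rows

rows = flatten(payload)
print(rows[0])  # {'company': 'Contoso', 'order_id': 1, 'sku': 'A', 'qty': 2}
```

Power Query records an equivalent chain of steps for you as you click, and re-runs the whole chain on refresh, which is what makes the cleanup reliable rather than a one-off manual exercise.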
Live query of an external source
Sometimes, either the sheer volume of data or the pattern of the analysis means that importing all of the source data into Excel is prohibitive or problematic (e.g., by creating data disclosure concerns).
Customers using OLAP PivotTables are already intimately familiar with the power of combining lightweight client side experiences in PivotTables and PivotCharts with scalable external engines. Interactively querying external sources with a business-friendly metadata layer in PivotTables allows users to explore and find useful aggregations and slices of data easily and rapidly.
One very simple way to create such an interactive query against an external source with a large volume of data is to “upsize” a data model into a standalone SQL Server Analysis Services database. Once a user has created a data model, turning it into a SQL Server Analysis Services cube is relatively straightforward for a BI professional, which in turn yields a centrally managed and controlled asset that can provide sophisticated security and data partitioning support.
As new technologies become available, look for more connectors that provide this level of interactivity with those external sources.
Export from an application to Excel
Due to the user familiarity of Excel, “Export to Excel” is a commonly requested feature in various applications. This typically creates a static export of a subset of data from the source application, free from the underlying business rules, for reporting purposes. As more applications are hosted in the browser, we’re adding new APIs that extend integration options with Excel Online.
We hope we were able to give you a set of patterns to help make discussions on big data more productive within your own teams. We’re constantly looking for better ways to help our customers make sense of the technology landscape and welcome your feedback!


Paul Allen’s Portland Trail Blazers use Microsoft’s Cortana technology to sell tickets and engage fans

By Taylor Soper

Microsoft co-founder Paul Allen probably didn’t think that the company he co-founded would one day help his NBA team sell more tickets and engage with fans more effectively. But that’s exactly what the Portland Trail Blazers are doing with Microsoft’s Cortana Intelligence Suite.
The team, owned by Allen since 1988, is using Microsoft’s big data predictive analytics software to learn more about how fans might purchase season tickets, as the company details in this blog post. The Trail Blazers’ marketing team uses Cortana Intelligence to identify different purchasing and attendance patterns among fans to figure out who might be more likely to become a new season ticket holder.
We first learned of this pilot program last month at the Moda Center in Portland, where the Trail Blazers hosted reporters for a run-down of the technology that the front office uses to improve the fan experience, sell more tickets, and much more.


Speaking of season tickets, the Blazers recently allowed season-ticket holders to use the team’s mobile app to renew their contracts for next season, becoming the first franchise in professional sports to do so. Vincent Ircandia, senior vice president of business operations, noted that if the Blazers tried something similar even three years ago, “we would have gotten incredible pushback.”
But now fans are more comfortable using their phones as a remote control for their lives, and the Blazers are responding. It took a lot of horsepower to activate this feature, with a mobile app partner, the renewal site developer, and Ticketmaster all participating, but the team is happy with the investment.


“We had 95 percent of season ticket holders renew, and 5 percent did so in the app,” Ircandia said. “We think that is significant. That is a lot of money.”
The team is partnering with a handful of local tech companies to test everything from proximity beacons inside the Moda Center to a parking app to barcode-scanning technology. You can read more about that here.
After losing four starters and entering the season with the league’s lowest payroll, the Blazers are perhaps the biggest surprise of the NBA in 2016. They crushed expectations and won 44 regular season games before again trumping critics with a playoff series win over the Los Angeles Clippers. The team is down 2-1 to the defending champion Golden State Warriors in the second round. Game 4 is Monday night in Portland.

To distinguish Grant Thornton International Ltd (GTIL) from its larger competitors, its member firms must maintain closer relationships with—and deliver personalized attention to—their clients. The independent member firms of GTIL are deploying Microsoft Dynamics CRM to facilitate collaboration around sales, marketing, and customer service. The solution has driven an increase in the number and size of sales wins, while helping the firms offer better value to clients and maintain consistency in a complex environment.

In an industry that has long been dominated by the Big Four, the Grant Thornton organization is under constant pressure to differentiate itself from its larger, better-known competitors. To do this, Grant Thornton member firms, which operate in more than 130 countries, offer their clients something the competition cannot: personalized service and closer relationships.

“We are not, nor do we want to be, the biggest accounting organization,” explains Rick Stow, head of Client Relationship Management. “We compete with four much larger firms, so we differentiate by having closer relationships to earn credibility and increase value. We choose to be much more connected to our clients. We need to have a good presence in our local markets if we're going to compete.”

Grant Thornton LLP, which operates in more than 56 offices in the United States, maintained various customer management solutions (including SalesLogix), which lacked visibility into contacts, accounts, and influencers across all employees and practice areas. This sometimes led to duplication of effort and missed opportunities. Further, because account teams were not aware of all interactions with a specific customer, it was difficult to gauge the strength or extent of the relationship. With no way to identify opportunities for cross-selling services, the member firms’ ability to grow the business was limited.

With a single standardized solution, Grant Thornton LLP aims to ensure that the firm—rather than individual employees—maintains the relationship with each contact.

Going mobile

Grant Thornton member firms in the United States, Canada, and Germany currently use Microsoft Dynamics CRM Online and on-premises for sales force automation, marketing management, and comprehensive account planning functionality built using the xRM extensibility capabilities of Microsoft Dynamics. In addition, two add-on solutions augment Grant Thornton LLP’s account and customer data with third-party research, including data from InsideView and LinkedIn. Future plans include adding social listening capabilities that will capture additional information from public sources.

The upgrade to Microsoft Dynamics CRM 2013 will also allow Grant Thornton LLP to fully exploit the mobile capabilities of the Microsoft Dynamics solution. Today, a handful of people access the system on Windows 8 Surface devices, but because most employees spend a majority of their time outside the office, making functionality available on virtually any device or platform will be key to successful, widespread adoption.

“More and more people are asking for a true ‘app’ experience,” explains Stow. “One of the most appealing aspects of Microsoft Dynamics CRM Online is the ability to deliver the same experience across multiple platforms. Whether you’re opening a form from within Outlook, from a browser, or on a mobile device, you have access to the information and full functionality.”

Fostering collaboration

Finding ways to foster collaboration among accounting and advisory professionals is key to Grant Thornton LLP’s continued success. “Relationships are everything,” says Stow. “So people are very protective of their relationships. But to understand the full scope of a client relationship, we need to overcome that individual mindset and find ways to collaborate effectively.”

In addition to establishing Microsoft Dynamics CRM as a standardized solution for CRM, Grant Thornton LLP has found that other technologies have been key to helping employees share information and work better together across member firms. A large-scale SharePoint deployment includes more than 50,000 unique sites where employees share documents and collaborate on accounts and initiatives. Communications and enterprise social media tools, including Lync and Yammer, provide voice, instant messaging and conferencing capabilities — and connect to Microsoft Dynamics CRM Online to facilitate communication around client and account activities.

As a result, employees at Grant Thornton LLP can depend on one another and deliver more comprehensive service to clients. “It’s not just about using the technology,” says Stow. “Our workforce today is more connected — and more productive. People see the interactions and work more efficiently, and that ultimately allows us to serve our clients better.”

Enhancing sales processes

Microsoft Dynamics provides Grant Thornton LLP with a process-driven CRM solution that accommodates different sales processes and regional requirements. For existing accounts, the connection capabilities of the solution allow each member firm to build virtual account teams that include all employees and partners involved in a particular relationship, who can collaborate on a plan to nurture and proactively drive the relationship forward.

Using the collected data in the CRM solution, account managers see strengths, weaknesses, and threats related to a particular relationship or opportunity. They identify competitors that might be serving the client — in addition to potential partners in other advisory roles who might influence sales — and collaboratively develop a plan to pursue the opportunity.

Since deploying Microsoft Dynamics CRM, Grant Thornton LLP has experienced a 450 percent jump in the number of opportunities it is tracking, a 36 percent increase in average win value, and a 700 percent rise in the number of contacts in the system. In addition, by tracking employee behaviors in the system, Grant Thornton LLP correlates factors such as sales activities and frequency of client contacts with win rate and revenue generation, using those insights to further refine the sales process.

Delivering better client service

Stow is quick to point out that the benefits of the CRM solution extend beyond sales. “Our goal for the solution is not to push additional services on our clients, but to better serve our clients,” he says. “The enhanced planning and collaboration tools enable us to respond to clients more effectively and in a more timely fashion.”

In particular, he points to the ability to conduct proactive planning using custom account plans developed in Microsoft Dynamics CRM. The tool helps Grant Thornton LLP identify priority accounts and includes a complete history of account activity and plans for nurture or contact efforts. Connections to partners and other service providers enable Grant Thornton LLP to identify partners who might enhance the value proposition through collaboration.

Using data to drive value

The value proposition of Grant Thornton LLP is also enhanced by the analytical capabilities of the CRM solution. Using CRM in conjunction with business intelligence tools such as Power BI, Grant Thornton LLP can identify trends or issues affecting a particular class or segment of clients and then reach out proactively to other clients in that segment to address these issues.

The development of new services is also driven by analytics, which provide insights into client demand based on service line, service type, revenue, and other factors, making the firm’s already smart people even smarter from a sales, marketing, and business development standpoint.

“The solution enables us to be much more focused,” says Stow. “We can use data to make data-driven decisions and communicate to our teams using actual data. Everyone here is very smart, but the additional insights — based on a single shared version of the truth — make us all smarter.”

Read customer success stories to learn how Managed Solution helps businesses implement technology productivity solutions.

For information on Microsoft Dynamics CRM, please call us at 800-257-0691.


What is a CRM? Part 2

By Ben Ward, Applications Analyst, MCTS, MCP, MS
If you haven’t read What is a CRM? Part 1, I highly recommend reading it before proceeding.
Sales - Social Selling: A CRM, such as Microsoft Dynamics CRM, can be used as a social selling tool. Microsoft partnered with InsideView to create Microsoft Insights, a social insight tool that integrates directly with Microsoft Dynamics CRM to give users instant additional information about leads, contacts, or accounts without leaving the application. Microsoft Insights derives this information by scraping publicly accessible social and business profiles for leads and contacts and aggregating the data. Currently, Microsoft Insights is free for Microsoft Dynamics CRM Online.
According to InsideView, Microsoft Insights can do the following:
1. Identify your buyer
  • Instantly qualify prospects with high-quality company data
  • Expand on previous wins by looking for similar companies
  • Identify decision makers in an account by functional area or title
2. Understand your buyer
  • Save hours of research time with up-to-the-minute company news and social media buzz at your fingertips
  • Increase your win rate by tailoring your pitch to immediate customer needs
3. Engage your buyer
  • Start a conversation with a target buyer by noting mutual acquaintances, past employers you have in common, or schools you both attended
  • Get a warm introduction into an account by leveraging your professional network and those of your colleagues

Marketing - Big Data Repository: As mentioned on the first line of What is a CRM? Part 1, a CRM is simply a glorified database. This means that if a CRM is implemented and used correctly, it can collect a vast amount of data in a short space of time. In a world where there are more mobile devices than people, more and more data is being collected in some form, and with big data comes big opportunity. There may not be a need right now for a full-time analyst at your company, but what about in the near future? An analyst is only as good as the data available. Wouldn’t it be beneficial to have legacy data on hand for the inevitable moment when your business needs an analyst? As Arthur Conan Doyle once wrote, “Data! Data! Data! I can’t make bricks without clay.”
About the author:
Ben has worked at Managed Solution for over two years and currently focuses on CRM customization and administration, Microsoft SharePoint integration and customization, and Business Intelligence analytics, including SQL reporting. Ben is a Microsoft Certified Technology Specialist, Microsoft Certified Professional, and Microsoft Specialist, and holds six Dynamics CRM certifications.
Contact us Today!

Chat with an expert about your business’s technology needs.