As written by Ryan Fuller on blogs.office.com

Microsoft Workplace Analytics—a powerful new organizational analytics solution—is now generally available as an add-on to any Office 365 enterprise plan.

According to a recent Forrester report, increasing employee productivity is the number one priority for C-level executives in the next year, with 96 percent of respondents citing it as a critical or high imperative. Workplace Analytics provides unprecedented behavioral insights that can be used to improve productivity, workforce effectiveness and employee engagement.

New insights from Office 365

Workplace Analytics taps into Office 365 email and calendar metadata, including to/from data, subject lines and timestamps, to shine a light on how the organization collaborates and spends time. It turns this digital exhaust—the data that comes naturally from our everyday work—into a set of behavioral metrics that can be used to understand what’s going on in an organization.

Microsoft has enabled Workplace Analytics with built-in privacy and compliance capabilities. Customers own their Office 365 data and decide how to apply insights generated by Workplace Analytics to solve tough business challenges. Workplace Analytics only leverages metadata that is aggregated and de-identified.

Workplace Analytics was designed with the flexibility to address a broad range of strategic and organizational culture-based initiatives. Let’s take a look at a few ways customers are using Workplace Analytics:

Sales productivity

A sales organization in a Fortune 500 company used Workplace Analytics to identify the collaborative patterns of top performers and then scaled those behaviors to the broader sales organization—resulting in a significant increase in sales. Some of these insights were expected, like the amount of time spent with customers. But others were new, like the size of the person’s internal network, which may be an indicator of the salesperson’s ability to get answers and solve customer questions.


Manager effectiveness

Freddie Mac used Workplace Analytics to drive a cultural change with managers. In looking at how time-usage metrics are related to engagement and retention, they found that the behaviors of managers were pivotal in determining employee engagement and retention. Behaviors, such as 1:1 manager time, level of leadership exposure given to employees and the degree to which work can be distributed evenly across an organization, are measurable through Workplace Analytics.

Space planning

An organization used the collaboration insights from Workplace Analytics to partner with its commercial real estate company, CBRE, on space planning for an office relocation. They analyzed the metadata attached to employee calendar items to calculate the travel time associated with meetings. They found that, as a result of the relocation, each employee reduced their travel time to meetings by 46 percent, a combined saving of 100 hours per week across all 1,200 employees involved in the move.

Customized queries

Every organization has unique business questions, which is why we’ve included the ability to create custom queries directly within Workplace Analytics. Data analysts can choose from a unique set of collaboration metrics to explore activities and trends within the business, including time spent in email, time in meetings, after-hours time and network size. Analysts can also create custom queries and filter to aggregated population subsets including regions, roles and functions.
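To make this concrete, here is a minimal, hypothetical sketch of the kind of follow-on analysis an analyst might run on an exported query result. Workplace Analytics itself is used through its own interface; the CSV file name and column names below (Region, MeetingHours, AfterHoursHours, NetworkSize) are illustrative assumptions, not actual product schema.

```python
# Hypothetical example: aggregate Workplace Analytics query output by region.
# The export file and column names are assumptions for illustration only.
import pandas as pd

df = pd.read_csv("wpa_person_query.csv")  # a de-identified person-query export

summary = (
    df.groupby("Region")[["MeetingHours", "AfterHoursHours", "NetworkSize"]]
      .mean()
      .round(1)
      .sort_values("AfterHoursHours", ascending=False)
)
print(summary)
```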

“Workplace Analytics is becoming an essential part of our toolkit,” said Tom Springer, partner at Bain. “It shows us where and how our clients are deploying their scarcest resources: the time, talent and energy of their people. Workplace Analytics consistently yields unique insights into resource allocation, collaboration behaviors and organizational networks. We integrate these insights with broader perspectives on strategy, operating model and results delivery to help our clients organize for maximum productivity.”

Building a digital, data-driven enterprise

At Microsoft, Workplace Analytics has yielded significant insights. “We believe building a true digital, data-driven enterprise requires organizations to empower and connect their people across everything—people, processes, data and systems,” said Kathleen Hogan, chief people officer at Microsoft. “Our HR Business Insights group is using Workplace Analytics across a variety of initiatives—from understanding the behaviors driving increased employee engagement, to identifying the qualities of top-performing managers who are leading Microsoft’s cultural transformation from within. We believe people analytics is a competitive necessity for any HR team.”

Managed Solution is in the top 1% of Microsoft Cloud Service Providers worldwide, and a premier partner aligned with Microsoft’s mission to empower every person and every organization on the planet to achieve more.
Download our Cloud Comparison Calculator for the latest in cloud pricing aggregation and an all-up cost comparison of on-premises versus cloud-hosted solutions.


Eco-testing a building before it is even built


Architects and engineers all over the world are inventing new ways to reduce the time, cost, and risk of constructing energy-efficient, high-performance buildings. Data-intensive analysis plays a key role in the design of “green” buildings—but the high cost of such analysis can be prohibitive to these eco-pioneers. Microsoft Azure provides a way to help building designers perform complex data analysis cost-efficiently—and quickly—facilitating the design of energy-efficient buildings.
Using pre-fabricated parts and fast computers
Despite the global demand for sustainably designed buildings, many design businesses face practical implementation challenges, such as the time-intensive process of running computer simulations and the cost of the powerful hardware needed to reduce execution time, of sustainable design specialists, and of computer-aided design (CAD) software. The good news: cloud computing has tremendous potential to change all of that.
Green Prefab, a small startup company in Italy, is working with Microsoft Research Connections and the Royal Danish Academy to develop next-generation tools that will one day allow in-depth simulations of a building’s performance—before it's built. This innovative approach is made possible by Microsoft Azure, Microsoft’s open and powerful public cloud platform, which provides inexpensive, data-intensive analysis.
Green Prefab is developing a library of prefabricated green building components that can be used to design eco-friendly buildings. Architects will be able to access civil engineering services, via the cloud, to produce energy efficiency reports, conduct in-depth structural analysis, and view photo-realistic preview images of the building. Green Prefab has teamed up with Microsoft Research Connections to develop some of the first tools for Microsoft Azure.
One essential ingredient of Green Prefab’s industrialized approach is its use of a data model that was developed for the construction industry in the 1990s by an international consortium known as buildingSMART. The buildingSMART model is an open format that makes it easy to exchange and share building information modeling (BIM) data between applications that were developed by different software vendors.
The open format of buildingSMART's data model has made it easier for Green Prefab to model prefabricated green building components. New tools that use massive computational power to simulate building performance will help the sustainable building industry.
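As a rough illustration of what working with that open format can look like, the sketch below reads building components from an IFC file (the buildingSMART data model) using the open-source IfcOpenShell library. The file name is a placeholder, and this is not Green Prefab's actual tooling; it is a minimal example of vendor-neutral BIM data access.

```python
# Minimal sketch: inspect building components in an IFC (buildingSMART) file
# with the open-source IfcOpenShell library. The file path is a placeholder.
import ifcopenshell

model = ifcopenshell.open("prefab_component_library.ifc")

# List walls and their identifiers as one example of exchanging BIM data
# between tools from different software vendors.
for wall in model.by_type("IfcWall"):
    print(wall.GlobalId, wall.Name)
```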
Developing energy simulations
With the goal of making it possible for engineers and architects to analyze complex building scenarios extremely quickly, Green Prefab and the Institute of Architectural Technology of the Royal Danish Academy collaborated to validate the potential usefulness of building performance energy simulations in the cloud.
The Royal Danish Academy conducted an experiment that used Green Prefab’s prototype web-based tools with a supercomputer in Barcelona, Spain, to execute parametric energy simulations of buildings by using the power of cloud computing.
The design of the test building reflected the floor space, occupancy, and environmental setting of a standard office in Copenhagen, Denmark. To understand the advantages of the proposed service compared with conventional ways of using simulations, a parallel experiment was conducted: starting from the same building design, the same architect conceived and tested 50 design options with a standard dual-core PC.
The cloud-based approach achieved approximately twice the potential energy savings, 33 percent, compared to only 17 percent for the conventional approach. It also reduced computing time significantly. Running the 220,184 parametric simulations on a standard dual-core PC would have taken 122 days; running those same energy simulations in the cloud took only three days.
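The sketch below illustrates the idea of a parametric sweep like the one described above: enumerate design variants and fan them out across workers. The parameters and the run_energy_simulation function are hypothetical stand-ins for a real building-physics engine; in the cloud, each combination would be dispatched to a separate worker instance rather than a local process.

```python
# Illustrative parametric sweep: enumerate design variants and evaluate them
# in parallel. run_energy_simulation is a placeholder, not a real engine.
from concurrent.futures import ProcessPoolExecutor
from itertools import product

glazing_ratios = [0.2, 0.4, 0.6]
insulation_mm = [50, 100, 150, 200]
orientations_deg = [0, 90, 180, 270]

def run_energy_simulation(params):
    glazing, insulation, orientation = params
    # Placeholder: a real run would invoke a building-physics engine here.
    kwh_per_m2 = 100 - 20 * glazing - 0.1 * insulation + 0.01 * orientation
    return {"params": params, "kwh_per_m2": kwh_per_m2}

if __name__ == "__main__":
    combos = list(product(glazing_ratios, insulation_mm, orientations_deg))
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(run_energy_simulation, combos))
    best = min(results, key=lambda r: r["kwh_per_m2"])
    print(f"{len(combos)} variants simulated; best variant: {best}")
```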
Reducing building time, cost, and risk
The wide adoption of cloud-based civil engineering tools could radically reshape the green building industry; Green Prefab's photo-realistic, 3-D illustrations of buildings in design are just the first step. By producing fully detailed digital models of a building, Green Prefab expects to be able to guarantee its appearance and performance, save construction time, and reduce costs by as much as 30 percent.
As cloud-based civil engineering tools become available to small and medium-sized architecture and engineering firms around the world, even small firms will be able to control costs in the pre-construction phase and reduce uncertainties during construction.
Microsoft’s collaboration with Green Prefab presents an optimistic picture of a future in which new cloud-based tools help reduce the energy consumption of buildings substantially. Such scientific breakthroughs will facilitate a shift towards building more environmentally friendly buildings that use energy and water efficiently, reduce waste, and provide a healthy environment for working and living.

Fire App Fights Wildfires with Data


Every second counts when combating a wildfire. Time lost can result in devastating loss of life or property. The University of the Aegean in Greece developed the VENUS-C Fire app—featuring Bing Maps, Microsoft Silverlight, and Windows Azure—to calculate and visualize the risk of wildfire ignition and to simulate fire propagation on the Greek island of Lesvos during its dry season. The university team generates a visualization of environmental factors each morning for the island’s fire management team, who then use the app to determine optimal resource allocation across the island for the day.
The fire app
The Fire app wildfire management software was designed by the Geography of Natural Disasters Laboratory at the University of the Aegean in Greece in 2011. Microsoft Research partnered with the lab during the development phase, providing IT expertise, high-performance computing resources, and cloud computing infrastructure.
The app was built with functionality from multiple resources, giving it both technological depth and a visual interface that is accessible to non-technical users. "[The Fire app] nicely integrates Bing Maps, Microsoft Silverlight, and Windows Azure in a single system that allows users to be able to see the big picture of an emerging fire or the potential of an emerging fire," observes Dennis Gannon, director of Cloud Research Strategy for Microsoft Research Connections.
All of the Fire app data is stored in the cloud via Windows Azure. "You need a large cloud infrastructure such as Windows Azure to be able to bring these sources together," Gannon explains. "The use of massive data analytics and machine learning is now the new frontier in many areas of science."
"With the cloud computing infrastructure, we were able to do business as we couldn’t do in the past," states Dr. Kostas Kalabokidis, associate professor, University of the Aegean. "[Windows Azure] is essential for us, because the cloud provides us with the necessary processing power and storage that is required. That means the real end users for the fire department do not need to have any huge processing power or storage capabilities locally."
Tracking risk factors daily
There are two distinct sets of users accessing the Fire app daily during the dry season: the lab team, which loads new information into the tool in the morning; and the emergency responders, including the fire service, fire departments, and civil protection agencies that address wildfires on the island of Lesvos. The responders use the tool to view the data in a refined, graphical form. The process starts with the forecast.
"Every morning, our systems ask the Windows Azure cloud to provide approximately 20 virtual machines in order to process the available weather data," explains Dr. Christos Vasilakos, research associate, University of the Aegean. "It then stores the fire-risk outputs that the user needs to see and make the proper call. From the fire-risk menu, the end user can see for the next 120 hours or five days what will be the fire ignition risk for our study area." Additional information, including an animation of the weather for the next 120 hours, also can be accessed through the same menu.
The information is updated in the morning. The Fire Brigade of Greece uses the fire-risk data and fire simulations, together with weather forecast information, to inform the day's resource allocations. Based on the Fire app projections, personnel and fire trucks may be deployed throughout the island to areas that appear to be at particular risk that day.
The simulator also provides crucial information during fires. The firefighters who aren't dispatched to the fire use the Fire app at the station to create a wildfire simulation for the blaze. The team begins with the ignition point and pulls in other critical data to determine the fire’s potential path.
Fighting fire in the cloud
Cloud computing and storage are not merely integral to the Fire app; they are enabling significant advances throughout the research world.
"The data tsunami is changing everything in science. Every discipline is now confronted with it—a vast exploration of data that comes from instruments, from online sources, from the web, from social media," observes Gannon. "Analyzing this data can’t be done on a PC." Cloud computing, and the processing power that accompanies it, has made it possible for researchers to reduce processing job times from months to just hours.
Kypriotellis believes it has made a difference on the island. While wildfires do still break out, statistical evidence shows the department has been better prepared to respond to and control fires, preventing potential loss of life and property. He is hopeful that, one day, other firefighters will be able to add the tool to their arsenal as well.

FaST-LMM and Windows Azure Accelerate Genetics Research


Today, researchers can collect, store, and analyze tremendous volumes of data; however, technological and storage limitations can severely impede the speed at which they can analyze these data. A new algorithm that was developed by Microsoft Research, called FaST-LMM (Factored Spectrally Transformed Linear Mixed Models), runs on Windows Azure in the cloud and expedites analysis time—reducing processing periods from years to just days or hours. An early application of FaST-LMM and Windows Azure helps researchers analyze data for the genetic causes of common diseases.
Searching for DNA Clues to Disease
The Wellcome Trust in Cambridge, England, is researching the genetic causes of seven diseases—including hypertension, rheumatoid arthritis, and diabetes. The project involves searching for combinations of genomic information to gain insight into an individual’s likelihood to develop one of these diseases. With a database containing genetic information from 2,000 people and a shared set of approximately 13,000 controls for each of the seven diseases, they needed both massive storage and powerful computation capacity.
They are storing their vast database of genetic information in the Windows Azure cloud, instead of traditional hardware storage, which represents a profound shift in how big data are stored. “We are taking on the challenge of taking what would be traditional high-performance computing, one of the hardest workloads to move to the cloud, and moving it to the cloud,” observes Jeff Baxter, development lead in the Windows HPC team at Microsoft. “There’s a variety of both technical and business challenges, which makes it exciting and interesting.”
Exploring the Power of the Cloud
Resource management is one of the primary issues associated with big data: not only determining how many resources are required for the project, but also identifying the right type of resources—within the available budget. For example, running a large project on fewer machines might save on hardware costs but result in substantial project delays. Researchers must find a balance that will keep their project on track while working with available resources.
The FaST-LMM algorithm can analyze enormous datasets in less time than existing alternatives. Microsoft Research also has the infrastructure that is required to perform the computations, explains David Heckerman, distinguished scientist at Microsoft Research. With more CPUs dedicated to a job, computations that would ordinarily take years to finish can be completed in just hours.
For the Wellcome Trust project, the team’s available resources included a combination of Windows HPC Server, Windows Azure, and the FaST-LMM algorithm. The team knew they had a powerful set of technologies; the question was whether that combination could achieve the required results in the desired time frame.
“For this project, we would need to do about 125 compute years of work. We wanted to get that work done in about three days,” explains Baxter. By running FaST-LMM on Windows Azure, the team had access to tens of thousands of computer cores and an improved algorithm that was able to expedite the work. “You’re still doing hundreds of compute years of work,” he explains, “but with these resources, we can actually do hundreds of compute years in a couple of days.”
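A quick back-of-envelope check, assuming the workload parallelizes cleanly, shows the scale Baxter is describing:

```python
# Rough estimate only: cores needed to finish 125 compute-years in ~3 days,
# assuming near-perfect parallelization of the workload.
compute_years = 125
target_days = 3

core_days_needed = compute_years * 365           # about 45,625 core-days of work
cores_required = core_days_needed / target_days  # roughly 15,000 cores in parallel
print(f"{core_days_needed:,} core-days -> ~{cores_required:,.0f} cores for {target_days} days")
```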
While the results were impressive, there was something that had an even bigger impact. “The most impressive thing was how quickly we could take this project from inception to actually completing it and generating new science,” Baxter notes. “This is stuff that, without both the improvements in the algorithms that the Microsoft Research guys had come up with and the ability for us to provide the tens to hundreds of thousands of cores, would have been infeasible.”
The Future for Big Data Research
The Wellcome Trust project is just the beginning of what could be a major shift in how research databases are stored and analyzed. “With this new, huge amount of data that’s coming online, we’re now able to find connections between our DNA and who we are that we could never find before,” Heckerman says. The ability to analyze that data more quickly, and with greater depth, could help scientists make faster breakthroughs in critical genetic research. The FaST-LMM algorithm running on Windows Azure is helping to accelerate just such breakthroughs.

Preventing flood disasters with Cortana Intelligence Suite

By Kristin Tolle as written on blogs.msdn.microsoft.com


On October 31, 2013, the city of Austin, Texas, faced a destructive flood. At the time, I was visiting David Maidment, a chaired professor of civil engineering at the Center for Research in Water Resources, on site at the University of Texas at Austin. The day before the flood, we had been discussing research and analytics around the long-standing drought conditions across western Texas. Overnight, a flash flood wreaked havoc on the Austin area, and the failure of a stream gauge on Onion Creek prevented local emergency response officials from being properly informed about the situation.
On the morning of October 30, the stream gauge monitoring Onion Creek was operational and reporting that the stream level was rising to dangerous levels. First responders were monitoring the gauge so that they would be prepared to send out support crews. However, around 5:00 a.m., the stream level reported by the gauge dropped to zero—which is not uncommon in the southern United States, where washes and stream levels can quickly drop to normal levels once the initial precipitation pattern passes. With the disaster appearing to have been averted, emergency responders turned their attention elsewhere. In actuality, the gauge had failed; the stream overran its banks, more than 500 homes flooded, and five people died.

Since the Onion Creek event, Texas and nearby Oklahoma have experienced floods every year, often several times a year, some of which have been more deadly than the 2013 event. In May 2015, a flood in this region claimed 48 lives, including two first responders, Deputy Jessica Hollis of the Austin Police Department and Captain Jason Farley of the Claremore, Oklahoma, Fire Department.
Researchers from the University of Texas at Austin (UT Austin) are collaborating with other researchers, federal agencies, commercial partners, and first responders to create the National Flood Interoperability Experiment (NFIE). The goals of the NFIE include standardizing data, demonstrating a scalable solution, and helping to close the gap between national flood forecasting and local emergency response. The objective is to create a system that interoperates across different publicly available data sources to model floods based on predictions.
Systems for each of the 13 water regions in the United States were developed, two of them at Microsoft Research by my visiting researcher, Marcello Somos (New England region), and intern Fernando Salas (Gulf region), both from UT Austin. After Marcello and Fernando returned to Austin, they collaborated with other institutions to create a flood map for the entire nation. This interoperable data product was used by NOAA to run a summer institute at the National Water Center in Tuscaloosa, Alabama, with 38 top hydrology and meteorology graduate students from around the world.
My colleague Prashant Dhingra and I presented Microsoft Azure and the recently announced big data advanced analytics and intelligence platform, Cortana Intelligence Suite, to the students at the annual National Water Center Summer Institute. Several enterprising attendees created interesting analytics projects. Tim Petty, a PhD candidate at the University of Alaska, Fairbanks, wanted to address “the Onion Creek Problem,” and what we can do to estimate flood levels when stream gauges fail. And so project SHEM began.
Streamflow hydrology estimate using machine learning (SHEM) is a Cortana Intelligence Suite experiment that creates a predictive model that can act as a proxy for streamflow data when a stream gauge fails. Thanks to its machine learning capabilities, it can even estimate stream levels where no stream gauge is present.
SHEM differs from most existing models in that it does not rely on the distances between stream gauges and their location attributes; instead, it relies solely on machine learning, learning from historical patterns of discharge and interpreting large volumes of complex hydrology data. This “training” prepares SHEM to predict streamflow information for a given location and time as it is affected by multivariate attributes (for example, type of stream, type of reservoir, amount of precipitation, and surface and subsurface flow conditions).
Using Cortana Intelligence Suite (CIS), our joint research team was able to ingest, clean, refine, and format the historical US Geological Survey stream gauge data. We leveraged the Boosted Decision Tree Regression module, one of many built-in machine learning algorithms, and used built-in modules for data cleaning and transformation as well as for model scoring and evaluation. Wherever custom functionality is needed, you can add R or Python modules directly to the workflow. This is the advantage of Azure Machine Learning: you can test multiple built-in or hand-coded algorithms and workflows, rerunning and testing with reproducible results, to build an optimal solution.
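As a rough analogue of that workflow (not the actual Azure ML experiment), the sketch below trains a boosted-tree regressor in scikit-learn to estimate discharge from multivariate attributes. The file name and column names are hypothetical stand-ins for the cleaned USGS-derived features described above.

```python
# Analogous sketch only: a boosted-tree regressor for streamflow estimation.
# Data file and feature names are hypothetical, not the actual SHEM inputs.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

gauges = pd.read_csv("usgs_gauge_history.csv")  # hypothetical cleaned export
features = ["precip_mm", "upstream_flow_cfs", "reservoir_level_ft", "month"]

X_train, X_test, y_train, y_test = train_test_split(
    gauges[features], gauges["discharge_cfs"], test_size=0.2, random_state=42
)

model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05)
model.fit(X_train, y_train)
print("R^2 on held-out data:", round(r2_score(y_test, model.predict(X_test)), 3))
```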
As with NFIE a year ago, SHEM is in the early stages of development and expanding it to more and more states is ongoing work. But the results bode well. All indications are that Cortana Intelligence Suite can use NFIE data and analysis products to effectively provide a reasonable estimate when a gauge is not present. Another byproduct of this experiment is that we can evaluate where there is the greatest variance in accuracy, which can, in turn, give us a good idea where it might be best to install new stream gauges.
And that should help all of us sleep a lot better—even in Austin.

Five key trends in business intelligence

The latest release of Power BI gives us a great opportunity to look at how some smart folks in the industry view the latest trends in business intelligence. Here are five trends you should know:

BI for everyone

BI used to be the sole province of data experts in IT or specialist information analysts. Now the move is toward democratizing the access and analysis of data across the organization. Today, a data analyst could be a product manager, a line-of-business executive, or a sales director. Browser-based analytics now enable business users themselves to answer impromptu questions that are relevant to their expertise, and then create sophisticated visualizations to share with others. More companies are recognizing BI for everyone as a strategic advantage. They’re supporting business users with tools to help empower them as data analysts.
What better tool for empowering business users than Power BI? With features that simplify accessing datasets, creating visualizations, and sharing reports, Power BI enables users to integrate data analysis in their work, whatever their role. When everyone can run analyses, amazing insights and discoveries can happen.

Self-service analytics

Industry analysts now predict that within two years, most business users in organizations will have access to self-service tools to prepare data for analysis. Such self-service BI solutions can transform business users from data consumers to active data analysts, reducing the time and complexity of data gathering and preparation, and shifting the monopoly on data extraction, processing, and visualization from IT to a model of data analysis across the organization.
Power BI is a complete self-service data analysis tool available right now, enabling all users to make smart decisions with data. Connect with ease to internal data sources and external data services, such as Microsoft Dynamics, Salesforce, and QuickBooks. Process data with drag-and-drop gestures. Use natural language to query datasets and create compelling visualizations. And share your reports with colleagues using content packs. Power BI is at the forefront of tools that help cultivate and strengthen data-savvy knowledge workers.

Real-time analytics

Static reports are giving way to interactive presentations. Interactivity enables business users to explore and answer questions with data updated in real time. Monitoring the latest data helps decision makers respond with accuracy and agility.
With Power BI you’re no longer limited to static presentations. Include real-time data slides in your business presentations. Explore data interactively and flexibly to answer critical business questions on the fly. Refresh data in reports and presentations with real-time updates, helping speed insights and drive faster, better-informed decision-making.

Data integration

Increases in data volume, velocity, and variety are fueling a trend toward comprehensive BI solutions that process information from multiple sources and in multiple views. Massive amounts of data are now available from disparate sources, increasing the demand for rapid data source integration accessible through simple interfaces.
Businesses are drawing upon huge volumes of unstructured social data to gain insights into customer behavior. Tracking social conversations at scale enables companies to learn when a topic is trending and what their customers are talking about. Insights gleaned from social data analytics lead to responsive optimization of products and services.
Power BI gives business users across the organization an easy-to-use tool to tap into insights hidden in large amounts of data. Whether the data reside in the cloud or on-premises, in structured databases or unstructured data processed by Hadoop, it’s accessible through Power BI. Use the Power BI visualization tools to communicate social trends to colleagues. As social trends evolve, have real-time updates reflected in your visualizations, enabling more agile responses to emerging market changes.

Mobile BI

The workforce is more mobile than ever, and mobile solutions for data analytics are maturing. Knowledge workers can now access and analyze data from their mobile devices more readily than ever. The trend toward mobile BI solutions will only continue to accelerate.
Power BI enables you to access and modify your dashboards no matter where you are, using touch-enabled native apps for Windows, iOS, and Android. Use the Power BI app to connect to your data, discover insights easily with data alerts, and share them with your team. The Power BI app also enables you to filter and pivot your data in different ways to quickly find answers on the go through your mobile device.
These trends point to the evolution of BI toward making new sources of information accessible, consumable, and meaningful to organizations of all sizes, including those that do not have advanced analytics skills or in-house resources.
The demand for self-service BI tools will spread to more businesses. The increase in big data means more organizations want broadly deployable, easy-to-use, and often cloud-based technologies for query, analysis, and reporting. Emerging data preparation capabilities now let business users extend self-service to information access, management, and data visualization, enabling them to prepare, integrate, curate, model, and enrich data that’s shareable with colleagues and stakeholders.


Source: http://blogs.msdn.com/b/powerbi/archive/2015/08/18/five-key-trends-in-business-intelligence.aspx

Contact us Today!

Chat with an expert about your business’s technology needs.