Centralizing national flood data in the cloud

Researchers from the University of Texas collaborated with federal agencies, commercial partners, first responders, and fellow academics to create the National Flood Interoperability Experiment (NFIE). They used Microsoft Azure to help build a prototype for a national flood data-modeling and mapping system with the potential to provide life- and cost-saving information to the public. The goals of the NFIE include standardizing data, demonstrating a scalable solution, and helping to close the gap between national flood forecasting and local emergency response.
Sharing flood information for better prediction and response
In October 2013, the Onion Creek area near Austin, Texas, faced a particularly destructive flood. While onsite studying the flood, David Maidment, professor of civil engineering at the University of Texas, spoke with Harry Evans, chief of staff for the Austin Fire Department. They realized that they had similar goals for improving flood prediction and response, and could collaborate well with their different areas of expertise.
Maidment, who specializes in hydrology and flooding at the Center for Research in Water Resources, brought together participants from academia, federal agencies, commercial partners, and first responders to create the National Flood Interoperability Experiment (NFIE). He wanted a technology infrastructure that would allow flood information to flow in from various agencies and academia, and then flow out to allow citizens and first responders to better understand what was happening.
“What we're trying to do in the National Flood Interoperability Experiment is to prototype a set of infrastructure and services that can communicate with one another and with the public in a uniform and open way,” says Maidment.

Microsoft Azure for data analysis, storage, and sharing in the cloud
Microsoft Research helped the NFIE find the computational power it needed in Microsoft Azure. The NFIE uses Azure to perform statistical analysis of present and past flood data to help design a prototype for a national flood data-modeling and mapping system with the potential to provide life- and cost-saving information to the public.
By using Azure, the NFIE can standardize and store data in the cloud. Maidment and colleagues at the University of Texas developed a new language that provides both a common way to store time-value pairs, such as river flow over time, and a standard way of communicating that information through the Internet. The US Geological Survey adopted this language to publish its time-series data on water observations, and the National Weather Service will also use it to publish forecasts as part of the NFIE. When this common language is implemented operationally, those organizations will be able to communicate and collaborate more efficiently with one another.
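To give a feel for what those time-value pairs look like in practice, here is a minimal sketch that pulls recent streamflow observations from the US Geological Survey's public Instantaneous Values web service, which publishes water observations as standardized time series. The endpoint is real, but the site number and parameter code are placeholders and the JSON field layout is assumed from the service's published format; this is an illustration, not part of the NFIE codebase.

```python
# Sketch: pulling time-value pairs (streamflow over time) from the USGS
# Instantaneous Values service. Site ID and parameter code are placeholders.
import requests

USGS_IV_URL = "https://waterservices.usgs.gov/nwis/iv/"

def fetch_streamflow(site_id: str, parameter_cd: str = "00060") -> list[tuple[str, float]]:
    """Return (timestamp, value) pairs for one gauge; 00060 is discharge."""
    params = {"format": "json", "sites": site_id, "parameterCd": parameter_cd}
    response = requests.get(USGS_IV_URL, params=params, timeout=30)
    response.raise_for_status()
    series = response.json()["value"]["timeSeries"]          # assumed JSON layout
    points = series[0]["values"][0]["value"]                  # list of {"dateTime", "value"}
    return [(p["dateTime"], float(p["value"])) for p in points]

if __name__ == "__main__":
    for timestamp, flow in fetch_streamflow("08158000")[:5]:  # placeholder site ID
        print(timestamp, flow)
```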
More flood information provides potential for improved public safety
NFIE uses Azure to deliver more forecasts than any one agency could. Currently, the National Weather Service makes forecasts at about 3,600 locations on rivers in the country. The NFIE expects to demonstrate delivery of specific and actionable data for 2.67 million locations nationally, including smaller streams. It also expects to increase the spatial density of flood forecast locations by a factor of more than 700, compared to the current National Weather Service system.
Ultimately, the NFIE has demonstrated that information with a greater level of detail has the potential to increase real-time responsiveness that can improve public safety and save lives. Working closely with the Austin Fire Department, the NFIE shows how data can be used to improve decision-making. Evans notes that this work will help the NFIE develop a template that agencies can use nationwide, along with their threat and risk analyses, to help communities better protect themselves from the risks of flooding.

Are you happy? Sad? Angry? Terrified? Microsoft knows…

By Bill Carmarda as written on nakedsecurity.sophos.com

Is something troubling you?
No, seriously, we saw you walking by, and you looked like something was on your mind. Something bad.
Well, it wasn’t exactly “us.” Our webcam caught you in a crowd. Then, our cloud-based face detection service flagged your mood. We don’t quite know what it is – we’re working on that! – but something’s bothering you.
Maybe we could help? To show you how much we care, here’s an ad for something to make you feel better…
Behold the next new power of the cloud!
This week, Microsoft announced the free public preview of Azure Media Face Detector, the latest in its growing set of speech and vision services running atop the Microsoft Azure cloud.
Microsoft’s core Face Detection service can detect and track up to 64 faces concurrently as they move around within an .MP4, .MOV, or .WMV video frame.
Once faces have been detected, Microsoft can pass them to its optional Emotion Detection component. That’ll analyse “multiple emotional attributes… including happiness, sadness, fear, anger, and more.”
As The Register noted, audio-based sentiment analysis tools are already in production helping call center reps figure out just how irate you are; now they’re coming to video, too.
Microsoft says Azure Media Face Detection can be used for “people counting, movement tracking, and even gauging audience participation and reaction via facial expressions.” If you’re using their neatly JSON-formatted data, you can instantly tell if a crowd likes what it’s seeing in your store window… or what it’s hearing from your political candidate.
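To show how that kind of JSON output might be consumed, here is a minimal sketch that tallies emotion labels across a processed video. The field names ("fragments", "faces", "emotion") and the file name are illustrative assumptions, not the documented Azure Media Services output schema.

```python
# Sketch: summarizing the kind of JSON a face/emotion detector might emit for a
# video. Field names below are hypothetical placeholders, not the real schema.
import json
from collections import Counter

def summarize_emotions(json_path: str) -> Counter:
    """Count how often each detected emotion label appears across the video."""
    with open(json_path) as f:
        result = json.load(f)
    counts = Counter()
    for fragment in result.get("fragments", []):     # time-sliced detections
        for face in fragment.get("faces", []):        # up to 64 tracked faces
            emotion = face.get("emotion")              # e.g. "happiness", "anger"
            if emotion:
                counts[emotion] += 1
    return counts

if __name__ == "__main__":
    print(summarize_emotions("face_detection_output.json").most_common())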
Security isn’t mentioned as an application, but Microsoft may well be thinking about such matters. Just last week, Rudiger Dorn, Microsoft’s Director of Cloud Computing, told TechWeekEurope that Microsoft’s AI and machine learning Platform-as-a-Service (PaaS) offerings will offer significant competitive value in “allow[ing] you to track criminals and terrorists.”
As with most “early days” technologies, Microsoft’s service has plenty of limitations. It does best with faces looking straight at the camera. It can lose track of faces that slip in and out of frame. And perhaps most significantly, this specific “Media Processor” doesn’t put a name with a face.
For that, you’ll need to link Face Detection with Face Recognition. But, of course, those technologies are coming along quite nicely, too. Microsoft was recently promoting face recognition as part of the same Project Oxford research that’s being “productized” here.
Microsoft isn’t the only outfit diving into this space. Google’s Cloud Vision API also promises face detection and Image Sentiment Analysis to recognize emotions like joy, sorrow, and anger. Per Google:
Combine this with object detection and product logo detection, so you can assess how people feel about your logo.
(How they really feel – not what they told your focus group!)
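For comparison, here is a minimal sketch of calling Google's face detection from Python and reading the per-face emotion likelihoods (joy, sorrow, anger). It assumes the google-cloud-vision client library is installed and credentials are configured; the image path is a placeholder, and exact attribute details can vary between library versions.

```python
# Sketch: Google Cloud Vision face detection with emotion likelihoods.
from google.cloud import vision

def detect_emotions(image_path: str) -> None:
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    for i, face in enumerate(response.face_annotations, start=1):
        print(f"Face {i}: joy={face.joy_likelihood}, "
              f"sorrow={face.sorrow_likelihood}, anger={face.anger_likelihood}")

if __name__ == "__main__":
    detect_emotions("crowd_photo.jpg")  # placeholder image path
```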
Whether you’re a Microsoft or Google partisan, you’ll still be able to run these services at enterprise scale.

Untangling airports using open source tools on Microsoft Azure

Scientists from the Universities of Stirling and Nottingham in the United Kingdom tackled the knotty problem of delays on airport taxiways, where planes are entering or leaving runways. Sandy Brownlee, PhD, and Jason Atkin, PhD, collaborated with Manchester Airport to use cloud computing to model the complex data from many airports worldwide. The team created open-source tools using Linux on Microsoft Azure to expand these insights and create new algorithms, sharing them on GitHub. The team is helping Manchester Airport to reduce delays, save money, and lessen environmental impact.
Tim Walmsley helps Manchester, the third-largest airport in the United Kingdom, manage an estimated 23 million passengers per year. To successfully plan airport operations and growth, he asked for data science help from university researchers, who specifically sought to gain insights by modeling the movements on taxiways, to and from the runways. “Aviation is an industry that’s growing. So there are lots of ways that the industry is trying to tackle the impacts that that growth could bring. The Airport Optimization Project feeds into that,” Walmsley, Environment Manager for Manchester, explained.
Sandy Brownlee, a senior research assistant at the University of Stirling, began helping Manchester Airport by searching for specific data on what is sometimes called “ground movement” or taxiing to populate a model. At first, he was frustrated because individual airports did not want to share everything with him. What he discovered, however, is that he could access public data using Flight Radar 24 and Open Street Map for dozens of airports worldwide. Jason Atkin, PhD, of the University of Nottingham, partnered with Brownlee to help model how taxiways can be leveraged to make airports more efficient.

Taxiways connect everything
The time aircraft spend getting to and from runways is one of the understudied choke points at airports. “Taxiing is a really critical problem because it connects everything else,” Brownlee explained. Many are familiar with strategies for aligning takeoffs or landings to improve safety or efficiency, but that slow crawl toward the gate (called a stand in the UK) can be a crucial link in the chain of events.
“The computing power we’ve got now allows us to understand and analyze data in different ways and pull out different information so we can better understand the true uncertainty in taxiing. We can understand which aircraft take a long time to get there, which aircraft get there quickly, and under what circumstances this is happening,” Atkin said.
Public data sources
Using Microsoft Azure, Brownlee could use Linux virtual machines and develop methods using OpenJDK. By leveraging these open source tools on Azure, he completed his work in about one-tenth of the time it would have taken on his desktop computer alone. “So rather than spending several months waiting for my data to be ready so that I could get on and do things, I had it within a couple of weeks,” he said.
There were three main tools that the team created and shared on GitHub. TaxiGen reads taxiway and runway information from Open Street Map and then automatically writes it out in a usable format. SnapTracks reads raw GPS coordinates with timings and combines them with the TaxiGen output. GM2KML generates helpful visualizations from the other two tools.
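To show the kind of open data TaxiGen consumes, the sketch below pulls taxiway and runway geometry from OpenStreetMap through the public Overpass API. It is not the TaxiGen implementation (the team's own tools are on GitHub); the Overpass query and the rough bounding box around Manchester Airport are illustrative assumptions.

```python
# Sketch: fetching taxiway and runway ways for one airport from OpenStreetMap
# via the Overpass API. The bounding box roughly covers Manchester Airport and
# is an approximation for illustration only.
import requests

OVERPASS_URL = "https://overpass-api.de/api/interpreter"

QUERY = """
[out:json][timeout:60];
way["aeroway"~"^(taxiway|runway)$"](53.33,-2.31,53.38,-2.23);
out geom;
"""

def fetch_airport_ways() -> list[dict]:
    response = requests.post(OVERPASS_URL, data={"data": QUERY}, timeout=90)
    response.raise_for_status()
    return response.json()["elements"]

if __name__ == "__main__":
    for way in fetch_airport_ways()[:5]:
        kind = way["tags"].get("aeroway")
        ref = way["tags"].get("ref", "unnamed")
        print(f"{kind} {ref}: {len(way['geometry'])} points")
```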
“Researchers rely on open tools and platforms to be able to develop and share their work. The ability to use the cloud for access to computing power not available on the desktop can act like a time machine, shrinking the time to results from months to weeks. This is a transformational way of thinking about research computing,” explained Kenji Takeda from Microsoft Research, who was supporting the project. Brownlee’s work on analysis of ground movement was funded by the Sandpit for Integrating and Automating Airport Operations and DAASE grants from the Engineering and Physical Sciences Research Council (EPSRC).
“By getting better predictions, you can start improving the rest of the airport system,” Atkin said. One pilot can take longer than another to cover the same ground, traffic congestion can be heavy at busy times, and mechanical delays of any sort can throw off predictions. Taxiing delays ripple through the entire system. Modeling and predicting that taxi time helps airports change when and where they direct planes and can yield big savings. Brownlee estimates modeling could help cut bottlenecks at Manchester in half.
Open source benefits
Because the tools created by the team are available to anyone, both Brownlee and Atkin foresee that other airports around the world will use them. “The work that Sandy’s doing is going to provide a lot of public domain data and the ability to analyze this for a lot of different airports. And we should be able to see these multi-million-pound savings at airports worldwide,” Atkin said.
Brownlee also hopes models will help guide decisions in weather emergencies or when a runway must be closed. Airports worldwide can use the modeling to understand what to do about a sudden change. “By getting more researchers worldwide involved … we could get a lot more benefit from different areas of knowledge all coming from the same problem,” he said.
No matter what the world does with the open-source tools, for Walmsley the great impact is at Manchester, where he expects “a much better experience for the customer and for the airlines using the airport.”

343 Industries Gets New User Insights from Big Data in the Cloud

Microsoft Case Study

The Halo franchise is an award-winning collection of properties that has grown into a global entertainment phenomenon. To date, more than 50 million copies of Halo video games have been sold worldwide. As developers prepared to launch Halo 4, they were tasked with analyzing data to gain insights into player preferences and support an online tournament. To handle those requests, the team used a powerful Microsoft technology called Windows Azure HDInsight Service, based on the Apache Hadoop big data framework. Using HDInsight Service to process and analyze raw data from the Windows Azure cloud operating system, the team was able to feed game statistics to the tournament’s operator, which used the data to rank players based on game play. The team also used HDInsight Service to update Halo 4 every week and support a daily email campaign designed to increase player retention. Organizations can also take advantage of data to quickly make business decisions.
Situation
Halo 4 marks the beginning of a new saga in the blockbuster franchise that has shaped entertainment history and defined a generation of gamers. Developed by Microsoft Studios’ 343 Industries game studio exclusively for the Microsoft Xbox 360 video game and entertainment system, Halo 4 brings back the Master Chief character in a new, epic sci-fi adventure. Released in November 2012, the game achieved more than $220 million in global sales in its first 24 hours and attracted more than 4 million players in its first five days after launch.
For the Halo Services Team, a development team at 343 Industries that manages the game, one of the biggest challenges is scaling to meet player demands. That’s one reason the team uses Windows Azure to power the game’s back-end supporting services. These services run the game’s key multiplayer features, including leaderboards and avatar rendering. Hosting the multiplayer parts of the game in Windows Azure ensures that the team has a way to quickly and inexpensively add and remove server capacity as needed.
As the game was prepared for release, however, 343 Industries was faced with an entirely new kind of challenge: to gain insight into player behavior and user preferences. To achieve this goal, Microsoft leadership asked 343 Industries to find a way to effectively mine user data.
At the same time, the team was faced with another need: analyzing data during the five-week Halo 4 “Infinity Challenge” tournament and providing results each day to their tournament partner, Virgin Gaming. The Halo 4 Infinity Challenge, the largest free-to-enter online Halo tournament in the world, tracked a player’s personal score in the game’s multiplayer modes across a global leaderboard, giving players a chance to win more than 2,800 prizes. Virgin Gaming needed to use business intelligence (BI) data gathered during the event to update leaderboards on the tournament website.
To meet these business requirements, 343 Industries knew it needed to find a BI technology solution that would integrate with Windows Azure. “One of the great things about 343 Industries is how they use cutting-edge technology like Azure,” says Alex Gregorio, Program Manager for Microsoft Studios, which published Halo 4. “So we wanted to find the best BI environment out there, and we needed to make sure it integrated with Azure.”
Because all Halo 4 game data is housed in Windows Azure, the team wanted to find a solution that could effectively produce usable BI information from that data. The team also needed to process this data in the same data center, minimizing storage costs and avoiding charges for data transfers across two data centers. Additionally, the team wanted full control over job priorities, so that the performance and delivery of analytical queries would not be affected by other processing jobs run at the same time. “We had to have a flexible solution that was not on-premises,” states Gregorio.
The team began its search for a new BI solution in the months leading up to the scheduled November launch of Halo 4.
Solution
Although it initially considered building its own custom BI solution, 343 Industries ultimately decided to use HDInsight Service, which is based on Apache Hadoop, an open-source software framework originally developed at Yahoo!. Hadoop, which is ideal for running complex analytics, can analyze massive amounts of unstructured data in a distributed manner. HDInsight Service is a big data solution for Windows Azure that empowers users to gain new insights from unstructured data, while connecting that data to familiar Microsoft BI tools. “Even though we knew we would be one of the earliest customers of HDInsight Service, it met all our requirements,” says Tamir Melamed, Development Manager on the Halo Services Team. “It can run any possible queries, and it is the best format for integration with Azure.”
The team was particularly attracted to the flexibility of HDInsight Service, which made it possible to separate the volume of raw data from the processing capacity needed to consume it. “With previous systems, we never had the separation between production and raw data, so there was always the question of how running analytics would affect production,” says Mark Vayman, Lead Program Manager for the Halo Services Team. “Hadoop solved that problem.”
HDInsight Service was also instrumental in changing the team’s focus from data storage to useful data analysis. That’s because Hadoop applies structure to data when it’s consumed, as opposed to traditional data warehouse applications that structure data before it is placed into a BI system.
The team wrote Windows Azure–based services that convert raw game data collected in Windows Azure into the Avro format, which is supported by Hadoop. This data is then pushed from the Windows Azure services in the Avro format into Windows Azure binary large object (BLOB) storage, which HDInsight Service is able to utilize with the ASV protocol. The data can then be accessed by anyone with the right permissions from Windows Azure.
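As a rough, present-day sketch of that first step (converting raw game events to Avro and pushing them into blob storage), the example below uses the fastavro and azure-storage-blob Python libraries. The event schema, container name, and connection-string environment variable are placeholders, not the Halo Services Team's actual pipeline code.

```python
# Sketch: raw game events -> Avro container file -> Azure blob storage,
# where a Hadoop cluster can pick the data up. Names below are hypothetical.
import io
import os
from fastavro import writer, parse_schema
from azure.storage.blob import BlobServiceClient

GAME_EVENT_SCHEMA = parse_schema({
    "name": "GameEvent",
    "type": "record",
    "fields": [
        {"name": "player_id", "type": "string"},
        {"name": "game_mode", "type": "string"},
        {"name": "score", "type": "int"},
        {"name": "timestamp", "type": "string"},
    ],
})

def upload_events_as_avro(events: list[dict], blob_name: str) -> None:
    # Serialize the raw events into an in-memory Avro container file.
    buffer = io.BytesIO()
    writer(buffer, GAME_EVENT_SCHEMA, events)
    buffer.seek(0)

    # Push the Avro file into blob storage for the cluster to consume.
    service = BlobServiceClient.from_connection_string(os.environ["AZURE_STORAGE_CONNECTION_STRING"])
    blob = service.get_blob_client(container="game-telemetry", blob=blob_name)
    blob.upload_blob(buffer, overwrite=True)

if __name__ == "__main__":
    sample = [{"player_id": "p1", "game_mode": "slayer", "score": 25,
               "timestamp": "2012-11-06T12:00:00Z"}]
    upload_events_as_avro(sample, "events/2012-11-06.avro")
```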
Every day, Hadoop handles millions of data-rich objects related to Halo 4, including preferred game modes, game length, and many other items. With Microsoft SQL Server PowerPivot for Microsoft SharePoint as a front-end presentation layer, Windows Azure BLOBs are created based on queries from the Halo 4 team.
Microsoft SQL Server PowerPivot for Microsoft Excel loads data from HDInsight Service using the Hadoop Hive ODBC driver. A PowerPivot workbook is then uploaded to PowerPivot for SharePoint and refreshed nightly within a SharePoint site, using the connection string stored in the workbook via the Hive ODBC driver to HDInsight Service. The team uses the workbooks to generate reports and facilitate their viewing of interactive data dashboards.
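The same Hive tables can also be queried directly over an ODBC connection from a script, which is useful for ad-hoc questions outside PowerPivot. The sketch below assumes an ODBC data source already configured for the cluster's Hive driver; the DSN, table, and column names are hypothetical.

```python
# Sketch: an ad-hoc Hive query over ODBC, mirroring the route PowerPivot uses.
import pyodbc

def top_game_modes(dsn: str = "HiveCluster") -> list[tuple[str, int]]:
    # Connect through an ODBC data source configured for the Hive driver.
    conn = pyodbc.connect(f"DSN={dsn}", autocommit=True)
    cursor = conn.cursor()
    cursor.execute("""
        SELECT game_mode, COUNT(*) AS plays
        FROM game_events
        GROUP BY game_mode
        ORDER BY plays DESC
        LIMIT 10
    """)
    return [(row.game_mode, row.plays) for row in cursor.fetchall()]

if __name__ == "__main__":
    for mode, plays in top_game_modes():
        print(mode, plays)
```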
Benefits
Using HDInsight Service, 343 Industries is more agile and can respond faster to customer requests. With the solution’s flexibility, the Halo Services Team is able to make weekly updates to the game and was able to help Virgin Gaming detect cheaters in the online Infinity Challenge tournament. HDInsight Service also supports customized email campaigns that the Halo marketing team is using to improve the user experience and retain players. In addition, the solution relies on familiar tools that can be used to simplify decision making.
Increases Agility and Speeds Response Time
With HDInsight Service, 343 Industries is more agile and can respond more quickly to business requests for BI. Part of the reason for this agility is the solution’s performance. With Hadoop, the team was able to build a configuration system that can be used to turn various Windows Azure data feeds on or off as needed. “That really helps us get optimal performance, and it’s a big advantage because we can use the same Azure data source to run compute for HDInsight Service on multiple clusters,” says Vayman. “It made it easy for us to drive business requests for analysis through an ad-hoc Hadoop cluster without affecting the jobs being run.”
And launching Hadoop clusters is a simple, fast process. “We can easily launch a new Hadoop cluster in minutes, run a query, and get back to the business in a few hours or less,” says Melamed. “Azure is very agile by nature, and Hadoop on Azure is more powerful as a result.”
Helps Halo 4 Team Make Weekly Game Updates
In addition to responding quickly to business requests, the Halo 4 team can take BI data pulled from the game each day and identify user trends, such as the average length of a game and the specific game features that players use the most. By getting these insights, the Halo 4 team can make frequent updates to the game. “Based on the user preference data we’re getting from Hadoop, we’re able to update game maps and game modes on a week-to-week basis,” says Vayman. “And the suggestions we get in the forums often find their way into the next week’s update. We can actually use this feedback to make changes and see if we attract new players. Hadoop and the forums are great tuning mechanisms for us.”
The team is also taking user feedback and giving it to the game’s designers, who can consider the suggestions in developing future editions of Halo.
Provides In-Game Analysis and Helps Identify Cheaters
Because Hadoop applies structure to data when it’s consumed, the team can focus more on analytics and less on storage. Instead of worrying about how to store and structure game data, the team can concentrate on what game modes users play in or how many users are playing at any given time. With this ability to focus more tightly on analysis, 343 Industries could meet the needs of Virgin Gaming. “Using Microsoft HDInsight Service, we were able to analyze the data during the five weeks of the Halo 4 Infinity Challenge,” says Vayman. “With the fast performance we got from the solution, we could feed that data to Virgin Gaming so it could update the leaderboards on the tournament website every day.”
In addition, because of the way the team set up Hadoop to work within Windows Azure, the team was able to detect cheaters during the Halo 4 Infinity Challenge. “HDInsight Service gives us the ability to easily read the data,” says Vayman. “In this case, there are many ways in which players try to gain extra points in games, and we could look back at previous data stored in Azure and identify user patterns that fit certain cheating characteristics, which was unexpected.” After receiving this data from the team, Virgin Gaming sent out a notification that any player found or suspected of cheating would be removed from the leaderboards and the tournament.
Contributes to Player Retention
The flexibility of the HDInsight Service BI solution also gives 343 Industries a way to reach out to players through customized campaigns, such as the series of email blasts the team sent to players immediately after the launch. For that campaign, the team set up Hadoop queries to identify users who started playing on a certain date. The team then wrote a file and placed it into a storage account on Windows Azure, where it was sent through Microsoft SQL Server 2008 R2 Integration Services into a database owned by the Microsoft Xbox marketing team.
The marketing team then used this data to send these new players emails customized by screening several variables including when they started playing Halo 4 and their game play behaviors. The choice of which email each player received was determined by the HDInsight Service system. “That gave marketing a new way to retain users and keep them interested by talking about new aspects of the game,” Gregorio says. The Halo marketing team plans to run similar email campaigns for the game until a new edition is released. “Basing an email campaign on HDInsight Service and Hadoop was a big win for the marketing team, and also for us,” adds Vayman. “It showed us that we were able to use data from HDInsight Service to customize emails, and to actually use BI to improve the player experience and affect game sales.”
Uses Familiar Tools to Simplify Decision Making
Microsoft has started to expand HDInsight Service to other internal groups, and one of the reasons adoption is growing is that users do not have to be engineers or Hadoop experts to take advantage of the technology. Data is collected in Windows Azure and made easily accessible through familiar productivity tools. “By hooking Hadoop into a set of tools that are already familiar, such as Microsoft Excel or Microsoft SharePoint, people can take advantage of the power of Hadoop without needing to know the technical ins and outs,” says Vayman. “A good example of that is the data about Halo 4 Infinity Challenge cheaters that we gave to Virgin Gaming. The people receiving that data are not Hadoop experts, but they can still easily use the data to make business decisions.”
Another reason Hadoop is becoming more widely used is that the technology continues to evolve into an increasingly powerful tool. “The traditional role of BI within Hadoop is expanding because of the raw capabilities of the platform,” says Brad Sarsfield, Microsoft SQL Server Developer. “In addition to just BI reporting, we’ve been able to add predictive analytics, semantic indexing, and pattern classification, which can all be leveraged by the teams using Hadoop.”
With these and other capabilities, there is little question that HDInsight Service will continue to positively affect business. “With Hadoop on Windows Azure, we can mine data and understand our audience in a way we never could before,” says Vayman. “It’s really the BI solution for the future.”
Windows Azure
With Windows Azure, you can quickly build, deploy, and manage applications across a global network of Microsoft-managed data centers. You can build applications using any operating system, language, or tool.

Communicate directly with customers via Skype for Business inside your mobile apps

By Skype for Business Team as written on blogs.office.com

The Skype for Business team is always on the lookout for new ways to bring greater value to our customers. We look for new and innovative ideas that connect people together utilizing the power of our platform. Today, we are pleased to announce that the Skype for Business App SDK Preview is now available for download. This new SDK enables developers to seamlessly integrate instant messaging, audio and video experiences into their custom iOS and Android applications.
At Build 2016, we previewed the Skype for Business App SDK and highlighted the ease of seamless integration into native mobile and tablet applications powered by Skype for Business. We showcased a real-world solution created by MDLIVE—a pioneer and visionary in telehealth and leading provider of online and on-demand healthcare delivery services and software—that connects patients and physicians together via mobile devices in a new, convenient and efficient way. And by collaborating with Microsoft and Office 365, MDLIVE is able to offer a secure and HIPAA-compliant system for patients and providers to communicate, share and review patient medical records, lab results and provide assessments.

“Skype for Business will provide MDLIVE with a much more scalable architecture, so we can accommodate even higher volumes of video consults daily,” said Randy Parker, founder and CEO of MDLIVE. “The adoption of Skype for Business also enables us to deliver a significantly improved user experience for both patients and physicians.”
The initial focus of the SDK Preview is to power “remote advisor” solutions that enable consumer mobile and tablet applications to communicate with Skype for Business organizations. Businesses can leverage the power of their existing Skype for Business Server and Skype for Business Online infrastructure—including the familiar native clients they use today—to reach customers in ways never before possible.
Whether you are looking to add voice, video or chat functionality into a new or existing application, the Skype for Business App SDK Preview makes it easy. The availability of these features is an important step in our Skype Developer Platform roadmap to combine the power of cloud voice, meetings and messaging with new cloud APIs and SDKs that work across a range of web and device platforms to drive new scenarios and help developers and partners re-imagine how they engage and win customers.
Download the Skype for Business App SDK Preview today. We look forward to your feedback on these new features and can’t wait to see what you build!
This preview release is a part of the larger Skype for Business developer opportunities announced at Build 2016. For more information or additional resources, be sure to visit the Skype Developer Platform site.
—James Skay, senior product marketing manager for the Skype for Business team

Using ChronoZoom to build a comprehensive timeline of climate change in the cloud

A professor at the University of California, Santa Barbara, explores the history of climate change in depth in his graduate-level Earth System Science class. To help students visualize events through the ages, he is developing a comprehensive history of climate change by using ChronoZoom, an open-source community project dedicated to visualizing the history of everything.
Building a historical view of climate change
Each year, Jeff Dozier, professor of Environmental Science and Management at the University of California, Santa Barbara, teaches a course in Earth System Science to between 80 and 100 incoming graduate students. Among the topics he covers: the climate record, how the Earth’s climate has changed through the ages, and the drivers behind those changes.
Covering millions of years’ worth of warming trends within a class term is a challenge; managing the massive volumes of data, charts, videos, illustrations, and other support materials is even more daunting. Dozier needed a way to pull together his materials into an accessible—and manageable—manner.
He found the solution in the award-winning ChronoZoom tool.
ChronoZoom allows users to navigate through "time," beginning with the Big Bang and continuing up to present-day events. Users can zoom in rapidly from one time period to another, moving through history as quickly or slowly as they desire. In 2013, a third-party authoring tool was built into ChronoZoom, enabling the academic community to share information via data, tours, and insights, so it can be easily visualized and navigated through Deep Zoom functionality.
Visual aids can have a particularly powerful impact when discussing climate change. Dozier is developing a history of the Earth that illustrates changes in climate from the beginning of the planet through modern day. The source materials include images, diagrams, graphs, and time-lapse movies that illustrate changes in the environment. Dozier plans to use the timeline as a teaching aid in his Earth System Science class.
“ChronoZoom has been easy to master and use,” Dozier notes. “You don’t need any sort of client-side application except a browser. All the data is stored on someone else’s machine. The processing is done in the cloud [through Windows Azure], not on your own computer. And the only thing that really shows up on your own computer is the results.” Moreover, thanks to the power of Windows Azure, the tool has the flexibility to scale up and down, enabling users to zoom in on a particular segment in time or zoom out to review climate change from the beginning of recorded history through today. Plus, content developers can share their presentations or timelines with others by simply sharing a link or posting it to a social media site.
Make your mark on history
ChronoZoom has already been used to illustrate the history of the Earth and explore the impact of climate change on the planet through the ages. There are many unexplored possibilities, however. The tool scales up and down, meaning any project can benefit—whether it’s the history of the world or just a review of the last few weeks. Dozier is hopeful others will use ChronoZoom to tell their stories by uploading their own data, images, and text to the cloud and using those materials in the classroom.

Preventing flood disasters with Cortana Intelligence Suite

By Kristin Tolle as written on blogs.msdn.microsoft.com

On October 31, 2013, the city of Austin, Texas, faced a destructive flood. At the time, I was visiting David Maidment, chaired professor of civil engineering at the Center for Research in Water Resources, on site at the University of Texas at Austin. The day before the flood, we had been discussing research and analytics around the long-standing drought conditions across western Texas. Overnight, a flash flood wreaked havoc on the Austin area, largely due to the failure of a stream gauge on Onion Creek, which prevented local emergency response officials from being properly informed about the situation.
On the morning of October 30, the stream gauge monitoring Onion Creek was operational and reporting that the stream level was rising to dangerous levels. First responders were monitoring the gauge so that they would be prepared to send out support crews. However, around 5:00 a.m., the stream level reported by the gauge dropped to zero—which is not uncommon in the southern United States, where washes and stream levels can quickly drop to normal levels once the initial precipitation pattern passes. With the disaster appearing to have been averted, emergency responders turned their attention elsewhere. In actuality, the gauge had failed; the stream overran its banks, more than 500 homes were flooded, and five people died.

Since the Onion Creek event, Texas and nearby Oklahoma have experienced floods every year, often several times a year, some of them deadlier than the 2013 event. In May 2015 a flood in this region claimed 48 lives, including two first responders, Deputy Jessica Hollis of the Austin Police Department and Captain Jason Farley of the Claremore, Oklahoma, Fire Department.
Researchers from the University of Texas at Austin (UT Austin) are collaborating with other researchers, federal agencies, commercial partners, and first responders to create the National Flood Interoperability Experiment (NFIE). The goals of the NFIE include standardizing data, demonstrating a scalable solution, and helping to close the gap between national flood forecasting and local emergency response. The objective is to create a system that interoperates across different publicly available data sources to model floods based on predictions.
Systems for each of the 13 water regions in the United States were developed, two of them at Microsoft Research by my visiting researcher, Marcello Somos (New England region), and intern Fernando Salas (Gulf region), both from UT Austin. After Marcello and Fernando returned to Austin, they collaborated with other institutions to create a flood map covering the entire nation. This interoperable data product was used by NOAA to run a summer institute at the National Water Center in Tuscaloosa, Alabama, with 38 top hydrology and meteorology graduate students from around the world.
My colleague Prashant Dhingra and I presented Microsoft Azure and the recently announced big data advanced analytics and intelligence platform, Cortana Intelligence Suite, to the students at the annual National Water Center Summer Institute. Several enterprising attendees created interesting analytics projects. Tim Petty, a PhD candidate at the University of Alaska, Fairbanks, wanted to address “the Onion Creek Problem,” and what we can do to estimate flood levels when stream gauges fail. And so project SHEM began.
Streamflow hydrology estimate using machine learning (SHEM) is a Cortana Intelligence Suite experiment that creates a predictive model that can act as a proxy for streamflow data when a stream gauge fails. And because of its machine learning capabilities, it can even estimate stream levels where no actual stream gauge is present.
SHEM differs from most existing models in that it does not rely on distances between stream gauges and their location attributes; instead, it is based solely on machine learning, drawing on historical patterns of discharge and interpreting large volumes of complex hydrology data. This “training” prepares SHEM to predict streamflow information for a given location and time as it is impacted by multivariate attributes (for example, type of stream, type of reservoir, amount of precipitation, and surface and subsurface flow conditions).
Using Cortana Intelligence Suite (CIS), our joint research team was able to ingest, clean, refine, and format the historical US Geological Survey stream gauge data. We leveraged the Boosted Decision Tree Regression module, which is one of many built-in machine learning algorithms. We also used built-in modules for data cleaning and transformation, as well as modules for model scoring and evaluation. Wherever custom functionality is needed, you can add R or Python modules directly to the workflow. This is the advantage of Azure Machine Learning: you can test multiple built-in or hand-coded algorithms and workflows, rerunning and testing with reproducible results, in order to build an optimal solution.
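For readers who want a feel for the modeling step, here is a minimal stand-in written with scikit-learn's gradient-boosted trees rather than the Azure ML Boosted Decision Tree Regression module the team actually used. The CSV file and feature columns are hypothetical; the point is only to show a boosted-tree regressor learning a streamflow proxy from historical gauge records.

```python
# Sketch: a boosted-tree regression proxy for streamflow, in the spirit of SHEM
# but using scikit-learn as a stand-in. File path and columns are hypothetical.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

FEATURES = ["precipitation_mm", "upstream_gauge_cfs", "reservoir_level_ft", "day_of_year"]
TARGET = "discharge_cfs"

def train_streamflow_proxy(csv_path: str) -> GradientBoostingRegressor:
    data = pd.read_csv(csv_path).dropna(subset=FEATURES + [TARGET])
    X_train, X_test, y_train, y_test = train_test_split(
        data[FEATURES], data[TARGET], test_size=0.2, random_state=42
    )
    model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05, max_depth=4)
    model.fit(X_train, y_train)
    # Report error on held-out records as a rough quality check.
    print("MAE on held-out data:", mean_absolute_error(y_test, model.predict(X_test)))
    return model

if __name__ == "__main__":
    model = train_streamflow_proxy("historical_gauge_records.csv")
```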
As with NFIE a year ago, SHEM is in the early stages of development and expanding it to more and more states is ongoing work. But the results bode well. All indications are that Cortana Intelligence Suite can use NFIE data and analysis products to effectively provide a reasonable estimate when a gauge is not present. Another byproduct of this experiment is that we can evaluate where there is the greatest variance in accuracy, which can, in turn, give us a good idea where it might be best to install new stream gauges.
And that should help all of us sleep a lot better—even in Austin.

Ford Uses Microsoft Cloud to Seamlessly Update Cars

By Sharon Gaudin as written on cloudfortomorrow.com
Ford Motor Co. is moving to automatically update its cars’ infotainment systems using Microsoft’s Azure cloud service.
The U.S. auto maker will begin selling some cars later this year with a computer system that can be automatically updated anytime the car connects to a Wi-Fi network. The cloud-based system is expected to be available in all of Ford’s cars by the end of 2016, according to Don Butler, Ford’s executive director of Connected Vehicle and Services.
The system — the Ford Service Delivery Network — will enable car owners to more easily get new services, even if their car is as much as 10 years old.
Ford is using cloud computing, data analytics and in-car software to change the consumer experience. Now a car’s navigation, entertainment and communication systems will be refreshable.
Previously, Ford’s infotainment system could be updated — but only by bringing the vehicle into a dealership or via a USB stick.
With a cloud-based system, updates to the car’s navigation system, contacts, audio system and center touch screen will be easier.

The updates should download seamlessly, without the driver being distracted by – or even aware of – the process.
“We couldn’t do this without the cloud,” Butler told Computerworld. “It’s really the only way to do it. Otherwise, we’d still need people to make a physical connection either at the dealer or through a stick.”
Other auto companies, including Tesla Motors, will be using over-the-air updates and cloud services to upgrade car owners’ infotainment, safety and powertrains.
Zeus Kerravala, an analyst with ZK Research, said car owners rarely update their in-car systems when they have to drive to the dealer or get a USB stick. The cloud-based service should make it easier to keep those systems up to date.
“I think it will be something that’s expected eventually,” he said. “Using the cloud is smart as it’s the only scalable way to do real time, over-the-air updates.”
To make this work, Ford is using a hybrid cloud system, which combines the features of a private and public cloud system.
Customer-sensitive data, such as the owner’s name and address, the vehicle’s mileage, its location and how well it’s running, will be stored on a private, on-premises cloud network that was built by Ford’s IT department.
For the public cloud, Ford is using Microsoft Azure, which handles the software updates.
Microsoft also is supplying the connection between Ford’s private cloud and the Azure public cloud.
“It’s a flexible solution that lets us tailor it to our needs,” Butler said. “Azure allows us the ability to flex between our own data centers and public data centers. And it’s global.”
That flexibility is the reason Ford’s IT executives chose Azure, instead of another cloud provider, like IBM, Google or Amazon Web Services (AWS).
“We wouldn’t have had that kind of flexibility with AWS or Google,” Butler noted. “You use their cloud as they constructed them. With Azure, we’ve constructed and architected our own service delivery network, and Azure is a component of that network. It gives us the ability to have a solution that bridges their public cloud and our own private cloud. Azure is the plumbing that connects the two clouds.”
Ford also had already worked with Microsoft on its in-vehicle software, so the Redmond, Wash., company came to the cloud job with an understanding of the automaker’s vision and needs.
“It’s important to work with a partner that understands the environment you’re trying to operate in,” Butler said. “Microsoft had that.”
Kerravala said he’s a little surprised that Ford selected Microsoft. “I think Azure versus AWS versus Google is really in the preference of the customer,” he added. “I think most people think of Amazon as being the de facto standard of the cloud, but Azure is a solid choice, too. Azure has Microsoft support behind it and may prove to be easier to grow long term because of that.”
Ezra Gottheil, an analyst with Technology Business Research, said it makes sense for Ford to stick with Microsoft since they have a history of working together.
“As far as picking Microsoft, Ford and Microsoft have been collaborating for a long time,” he noted. “And Azure is a good platform for managing digital assets on a network. Here, it’s really two things — the Azure fabric, which is a software services platform, and the Azure service, which is Microsoft’s hosted version of the platform. Both make sense here.”
