Eco-testing a building before it is even built


Architects and engineers all over the world are inventing new ways to reduce the time, cost, and risk of constructing energy-efficient, high-performance buildings. Data-intensive analysis plays a key role in the design of “green” buildings—but the high cost of such analysis can be prohibitive to these eco-pioneers. Microsoft Azure provides a way to help building designers perform complex data analysis cost-efficiently—and quickly—facilitating the design of energy-efficient buildings.
Using pre-fabricated parts and fast computers
Despite the global demand for sustainably designed buildings, many design businesses face practical implementation challenges, such as the time-intensive process of running computer simulations and the expense of the powerful computing technology needed to reduce execution time, of sustainable design specialists, and of computer-aided design (CAD) software. The good news: cloud computing has tremendous potential to change all of that.
Green Prefab, a small startup company in Italy, is working with Microsoft Research Connections and the Royal Danish Academy to develop next-generation tools that will one day allow in-depth simulations of a building’s performance—before it's built. This innovative approach is made possible by Microsoft Azure, Microsoft’s open and powerful public cloud platform, which provides inexpensive data-intensive analysis.
Green Prefab is developing a library of prefabricated green building components that can be used to design eco-friendly buildings. Architects will be able to access civil engineering services, via the cloud, to produce energy efficiency reports, conduct in-depth structural analysis, and view photo-realistic preview images of the building. Green Prefab has teamed up with Microsoft Research Connections to develop some of the first tools for Microsoft Azure.
One essential ingredient of Green Prefab’s industrialized approach is its use of a data model that was developed for the construction industry in the 1990s by an international consortium known as buildingSMART. The buildingSMART model is an open format that makes it easy to exchange and share building information modeling (BIM) data between applications that were developed by different software vendors.
The open format of buildingSMART's data model has made it easier for Green Prefab to model prefabricated green building components. New tools that use massive computational power to simulate building performance will help the sustainable building industry.
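To make the idea concrete, here is a minimal Python sketch of reading a buildingSMART (IFC) model and listing its wall components. It assumes the open-source ifcopenshell package and a local file named office.ifc; neither is mentioned in the article, and Green Prefab's own tooling is not described here.

```python
# Minimal sketch: enumerate wall components in an IFC (buildingSMART) model.
# Assumes the open-source ifcopenshell package and a local file "office.ifc";
# neither is specified by the article.
import ifcopenshell

model = ifcopenshell.open("office.ifc")  # parse the IFC file

# IFC entities are typed; IfcWall is one of the standard building elements.
for wall in model.by_type("IfcWall"):
    # GlobalId and Name are standard IFC attributes on rooted entities.
    print(wall.GlobalId, wall.Name)
```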
Developing energy simulations
With the goal of making it possible for engineers and architects to analyze complex building scenarios extremely quickly, Green Prefab and the Institute of Architectural Technology of the Royal Danish Academy collaborated to validate the potential usefulness of building performance energy simulations in the cloud.
The Royal Danish Academy conducted an experiment that used Green Prefab’s prototype web-based tools with a supercomputer in Barcelona, Spain, to execute parametric energy simulations of buildings using the power of cloud computing.
The design of the test building reflected the floor space, occupancy, and environmental setting of a standard office in Copenhagen, Denmark. In order to understand the advantages of the proposed service, in comparison to conventional ways of using simulations, a parallel experiment was conducted. Starting from the same building design, the same architect conceived and tested 50 design options with a standard dual-core PC.
The cloud-based approach achieved approximately twice the potential energy savings, 33 percent, compared to only 17 percent for the conventional approach. It also reduced computing time significantly. Running the 220,184 parametric simulations on a standard dual-core PC would have taken 122 days; running those same energy simulations in the cloud took only three days.
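A quick back-of-the-envelope check of those figures, using only the numbers quoted above:

```python
# Rough arithmetic on the figures quoted above.
simulations = 220_184
pc_days = 122
cloud_days = 3

seconds_per_sim = pc_days * 24 * 3600 / simulations
print(f"~{seconds_per_sim:.0f} s per simulation on the dual-core PC")  # ~48 s

print(f"~{pc_days / cloud_days:.0f}x faster in the cloud")             # ~41x
```

In other words, the cloud run delivered roughly a 40-fold speedup over the single desktop machine.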
Reducing building time, cost, and risk
The wide adoption of cloud-based civil engineering tools could radically reshape the green building industry; Green Prefab's photo-realistic, 3-D illustrations of buildings in design are just the first step. By producing digital, full-detailed models of a building, Green Prefab expects to be able to guarantee its appearance and performance, save construction time, and reduce costs as much as 30 percent.
As civil engineering tools in the cloud become available to small and medium-sized architecture and engineering firms around the world, even small firms will be able to control costs in the pre-construction phase and reduce uncertainties during construction.
Microsoft’s collaboration with Green Prefab presents an optimistic picture of a future in which new cloud-based tools help reduce the energy consumption of buildings substantially. Such scientific breakthroughs will facilitate a shift towards building more environmentally friendly buildings that use energy and water efficiently, reduce waste, and provide a healthy environment for working and living.

Fire App Fights Wildfires with Data


Every second counts when combating a wildfire. Time lost can result in devastating loss of life or property. The University of the Aegean in Greece developed the VENUS-C Fire app—featuring Bing Maps, Microsoft Silverlight, and Windows Azure—to calculate and visualize the risk of wildfire ignition and to simulate fire propagation on the Greek island of Lesvos during its dry season. The university team generates a visualization of environmental factors each morning for the island’s fire management team, who then use the app to determine optimal resource allocation across the island for the day.
The fire app
The Fire app wildfire management software was developed in 2011 by the Geography of Natural Disasters Laboratory at the University of the Aegean in Greece. Microsoft Research partnered with the lab during the development phase, providing IT expertise, high-performance computing resources, and cloud computing infrastructure.
The app was built with functionality from multiple resources, giving it both technological depth and a visual interface that is accessible to non-technical users. "[The Fire app] nicely integrates Bing Maps, Microsoft Silverlight, and Windows Azure in a single system that allows users to be able to see the big picture of an emerging fire or the potential of an emerging fire," observes Dennis Gannon, director of Cloud Research Strategy for Microsoft Research Connections.
All of the Fire app data is stored in the cloud via Windows Azure. "You need a large cloud infrastructure such as Windows Azure to be able to bring these sources together," Gannon explains. "The use of massive data analytics and machine learning is now the new frontier in many areas of science."
"With the cloud computing infrastructure, we were able to do business as we couldn’t do in the past," states Dr. Kostas Kalabokidis, associate professor, University of the Aegean. "[Windows Azure] is essential for us, because the cloud provides us with the necessary processing power and storage that is required. That means the real end users for the fire department do not need to have any huge processing power or storage capabilities locally."
Tracking risk factors daily
There are two distinct sets of users accessing the Fire app daily during the dry season: the lab team, which loads new information into the tool in the morning; and the emergency responders, including the fire service, fire departments, and civil protection agencies that address wildfires on the island of Lesvos, who use the tool to see the data in a refined, graphical view. The process starts with the forecast.
"Every morning, our systems ask the Windows Azure cloud to provide approximately 20 virtual machines in order to process the available weather data," explains Dr. Christos Vasilakos, research associate, University of the Aegean. "It then stores the fire-risk outputs that the user needs to see and make the proper call. From the fire-risk menu, the end user can see for the next 120 hours or five days what will be the fire ignition risk for our study area." Additional information, including an animation of the weather for the next 120 hours, also can be accessed through the same menu.
The information is updated in the morning. The Fire Brigade of Greece uses the fire-risk data and fire simulations, together with weather forecast information, to inform the day's resource allocations. Based on the Fire app projections, personnel and fire trucks may be deployed throughout the island to areas that appear to be at particular risk that day.
The simulator also provides crucial information during fires. The firefighters who aren't dispatched to the fire use the Fire app at the station to create a wildfire simulation for the blaze. The team begins with the ignition point and pulls in other critical data to determine the fire’s potential path.
Fighting fire in the cloud
Cloud computing and storage are not merely integral to the Fire app; they are enabling significant advances throughout the research world.
"The data tsunami is changing everything in science. Every discipline is now confronted with it—a vast exploration of data that comes from instruments, from online sources, from the web, from social media," observes Gannon. "Analyzing this data can’t be done on a PC." Cloud computing, and the processing power that accompanies it, has made it possible for researchers to reduce processing job times from months to just hours.
Kypriotellis believes it has made a difference on the island. While wildfires do still break out, statistical evidence shows the department has been better prepared to respond to and control fires, preventing potential loss of life and property. He is hopeful that, one day, other firefighters will be able to add the tool to their arsenal as well.


The next phase of Microsoft Academic: intelligent bots at your service!

By Kuansan Wang as written on blogs.msdn.microsoft.com


Progress in AI research and applications is exploding, and that explosion extends to our own team working on academic services. Continuing our work supercharging Bing and Cortana, we are also applying new technologies to Microsoft Academic, which serves the research community. If you’re not familiar with Microsoft Academic, this online destination helps researchers connect with the papers, conferences, people, and ideas that are most relevant, using bots that read, understand, and deliver the scientific news and papers researchers need to further their work.
Designed by and for researchers like myself, the site puts the broadest and deepest set of scientific information at your fingertips, with the ability to go beyond keywords to the contextual meaning of the content. Recently, we further enhanced the analytic content so users can see the latest research, news, and people, ranked by importance and credibility. Users can even drill down on the people, events, and institutions they care most about.
Behind the scenes, we are taking advantage of the fact that machines do not require time to sleep or eat, and have superior memory to humans. We have trained our AI robots to read, classify, and tag every document published to the web in real time. The result is a massive collection of academic knowledge we call the Microsoft Academic Graph (MAG), which is growing at roughly 1 million articles per week. While one set of robots is busy gathering knowledge from the web, another set of robots is dedicated to analyzing citation behaviors and computing the relative importance of each node in the MAG so that users are always presented with information they need and want.
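The post does not spell out how that relative importance is computed, but citation-based ranking is often illustrated with an eigenvector-style iteration such as PageRank. The toy sketch below runs plain PageRank over a four-paper citation graph purely as an illustration; MAG's actual ranking signal is more sophisticated and is not described here.

```python
# Toy PageRank over a small citation graph: paper -> papers it cites.
# Illustration only; not the actual MAG ranking algorithm.
citations = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": [],
    "D": ["C", "A"],
}

def pagerank(graph, damping=0.85, iters=50):
    nodes = list(graph)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        for src, targets in graph.items():
            if targets:
                share = damping * rank[src] / len(targets)
                for t in targets:
                    new[t] += share
            else:  # dangling node: spread its rank evenly
                for n in nodes:
                    new[n] += damping * rank[src] / len(nodes)
        rank = new
    return rank

print(pagerank(citations))  # "C", the most-cited paper, scores highest
```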
Microsoft Academic is based on the work our team developed for Microsoft Cognitive Services, including open APIs that give developers AI-based semantic search tools and entity-linking capabilities. We’re also applying AI semantic search—which is contextual and conversational—to Cortana, Bing, and more.
As a research organization, we understand the pivotal role that open communication plays in advancing science. As such, we’re making the back-end dataset and algorithms available to all through Cognitive Services. There, everyone can access and conduct research on the massive and growing dataset through the cloud-based APIs. This means you don’t have to worry about the logistics of transmitting the massive dataset over the Internet, or manage a cluster of computers just to host and analyze the data. We are particularly excited that the research community has taken advantage of these cloud resources and already is collaborating on a common data and benchmarks platform to advance the state of the art. Earlier this year, we saw 81 teams participate in the WSDM Cup 2016 to develop new methods to rank papers, including newly published ones that have yet to receive any citations. An ongoing challenge is the KDD Cup 2016, which is focused on finding a better way to rank the importance of research institutions. The results of the first two stages of the contest have already been published, and I cannot wait to see the final outcomes and learn what new insights and technologies the 500 participating teams have developed when results are announced in August at KDD 2016 in San Francisco!
I encourage you to start experiencing the breadth and depth of what Microsoft Academic currently has to offer and to continue this journey with us in our mission to empower every academic and every academic institution on the planet to achieve more.


Understanding cloud forests through the power of cloud computing


The Brazilian Cloud Forest Sensing Project is studying how cloud forests function in response to climatic variability. The project deployed more than 700 Internet-enabled sensors connected to Microsoft Azure and is gathering integrated data on physical and biological processes within the study site. Through its partnership with Microsoft Research, the Brazilian Cloud Forest Sensing Project has created a repeatable Internet-of-Things (IoT) solution that revolutionizes how research can benefit from the use of a wireless sensor network, cloud technology, and automated data stream processing.
Researching cloud forests at work
Brazil is one of the most forested countries in the world. More than 60 percent of Brazil is covered by forest, including many cloud forests—moist forests characterized by persistent low cloud cover. Cloud forests help provide clean water because their trees intercept water from clouds. That water then drips onto the soil and feeds rivers, lakes, and irrigation systems, even during periods of low rainfall.
The Brazilian Cloud Forest Sensing Project is an initiative of the São Paulo Research Foundation (FAPESP) Biodiversity Research Program, supported by the Microsoft Research-FAPESP Joint Research Center. Rafael Oliveira, a professor of Ecology from the University of Campinas (Unicamp), conducts research for the project in the cloud forests of Campos do Jordão, Brazil.
The goal of Oliveira’s investigation is to understand how cloud forests work and then measure the impact of microclimatic variation on several ecosystem processes. The research project focuses on a fragmented forest—which most forests in the world are—in contrast to a continuous forest such as the ones found in the Amazon. Most fragmented forests are in proximity to urban areas and are critical to the water supply of those communities.
Managing sensor data in the cloud
Collaborators from the Brazilian Cloud Forest Sensing Project partnered with Microsoft Research to define research questions and develop software to analyze streams of data, which the Sensing Project team gathered every 15 minutes from a unified ensemble of more than 700 sensors on plants, in soil, and above tree canopies throughout the forest.
To manage and process high volumes of complex data, the Sensing Project team uses Microsoft Azure to store, process, and visualize the data coming in from the sensors. The sensors themselves are connected to Azure as an instance of the Internet of Things: devices embedded in the physical world sending data to the cloud and changing their behavior based on the directives assigned to them.
The sensors are networked to and communicate directly with the Microsoft Azure cloud platform. Because researchers have a constant stream of real-time data, they can quickly observe what's happening and if necessary, remotely change how the sensors collect data. This ability to make fast adjustments provides the researchers with high-fidelity data for specific time periods.
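A minimal sketch of that ingest-and-reconfigure loop appears below. The message shapes and the send/poll helpers are illustrative assumptions; the project's actual device protocol and Azure pipeline are not detailed in this article.

```python
# Sketch of the sensor loop described above: send a reading every interval,
# and apply any directive the cloud sends back (e.g., a new sampling rate).
# Message shapes and the send/poll functions are illustrative assumptions.
import json, time
from datetime import datetime, timezone

sampling_minutes = 15  # the network reported readings every 15 minutes

def read_sensor():
    return {"sensor_id": "canopy-042",
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "soil_moisture": 0.31, "leaf_wetness": 0.84}

def send_to_cloud(payload: dict):
    print("-> cloud:", json.dumps(payload))  # stand-in for the real uplink

def poll_directives():
    # Stand-in for checking cloud-to-device messages; might return e.g.
    # {"sampling_minutes": 5} when researchers want finer-grained data.
    return None

while True:
    send_to_cloud(read_sensor())
    directive = poll_directives()
    if directive and "sampling_minutes" in directive:
        sampling_minutes = directive["sampling_minutes"]
    time.sleep(sampling_minutes * 60)
```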
Providing a model for future research
Using a network of 700 sensors to study a forest was a new concept, and the team had to determine all the pertinent details, such as how to manage the sensors, which sensors were needed, and what kinds of data to measure. The researchers at the Sensing Project have created a repeatable technical solution, and researchers worldwide will be able to learn from it and apply these practices to their own sensor-based studies. The methodologies that are being developed will help Brazil’s investigations and other global research projects.

Microsoft research project puts cloud in ocean for the first time

By Athima Chansanchai as written on news.microsoft.com


In 2015, starfish, octopus, crabs and other Pacific Ocean life stumbled upon a temporary addition to the seafloor, more than half a mile from the shoreline: a 38,000-pound container. But in the ocean, 10 feet by 7 feet is quite small. The shrimp exploring the seafloor made more noise than the datacenter inside the container, which consumed computing power equivalent to 300 desktop PCs.
But the knowledge gained from the three months this vessel was underwater could help make future datacenters more sustainable, while at the same time speeding data transmission and cloud deployment. And yes, maybe even someday, datacenters could become commonplace in seas around the world.
The technology to put sealed vessels underwater with computers inside isn’t new. In fact, it was one Microsoft employee’s experience serving on submarines that carry sophisticated equipment that got the ball rolling on this project. But Microsoft researchers do believe this is the first time a datacenter has been deployed below the ocean’s surface. Going under water could solve several problems by introducing a new power source, greatly reducing cooling costs, closing the distance to connected populations and making it easier and faster to set up datacenters.

A little background gives context for what led to the creation of the vessel. Datacenters are the backbone of cloud computing, and contain groups of networked computers that require a lot of power for all kinds of tasks: storing, processing and/or distributing massive amounts of information. The electricity that powers datacenters can be generated from renewable power sources such as wind and solar, or, in this case, perhaps wave or tidal power. When datacenters are closer to where people live and work, there is less “latency,” which means that downloads, Web browsing and games are all faster. With more and more organizations relying on the cloud, the demand for datacenters is higher than ever – as is the cost to build and maintain them.
All this combines to form the type of challenge that appeals to Microsoft Research teams who are experts at exploring out-of-the-box solutions.
Ben Cutler, the project manager who led the team behind this experiment, dubbed Project Natick, is part of a group within Microsoft Research that focuses on special projects. “We take a big whack at big problems, on a short-term basis. We take a look at something from a new angle, a different perspective, with a willingness to challenge conventional wisdom.” So when a paper about putting datacenters in the water landed in front of Norm Whitaker, who heads special projects for Microsoft Research NExT, it caught his eye.
“We’re a small group, and we look at moonshot projects,” Whitaker says. The paper came out of ThinkWeek, an event that encourages employees to share ideas that could be transformative to the company. “As we started exploring the space, it started to make more and more sense. We had a mind-bending challenge, but also a chance to push boundaries.”
One of the paper’s authors, Sean James, had served in the Navy for three years on submarines. “I had no idea how receptive people would be to the idea. It’s blown me away,” says James, who has worked on Microsoft datacenters for the past 15 years, from cabling and racking servers to his current role as senior research program manager for the Datacenter Advanced Development team within Microsoft Cloud Infrastructure & Operations. “What helped me bridge the gap between datacenters and underwater is that I’d seen how you can put sophisticated electronics under water, and keep it shielded from salt water. It goes through a very rigorous testing and design process. So I knew there was a way to do that.”
James recalled the century-old history of cables in oceans, evolving to today’s fiber optics found all over the world.
“When I see all of that, I see a real opportunity that this could work,” James says. “In my experience the trick to innovating is not coming up with something brand new, but connecting things we’ve never connected before, pairing different technology together.”
Building on James’s original idea, Whitaker and Cutler went about connecting the dots.


Cutler’s small team applied science and engineering to the concept. A big challenge involved people. People keep datacenters running. But people take up space. They need oxygen, a comfortable environment and light. They need to go home at the end of the day. When they’re involved you have to think about things like landscaping and security.
So the team moved to the idea of a “lights out” situation. A very simple place to house the datacenter, very compact and completely self-sustaining. And again, drawing from the submarine example, they chose a round container. “Nature attacks edges and sharp angles, and it’s the best shape for resisting pressure,” Cutler says. That set the team down the path of trying to figure out how to make a datacenter that didn’t need constant, hands-on supervision.
This initial test vessel wouldn’t be too far off-shore, so they could hook into an existing electrical grid, but being in the water raised an entirely new possibility: using the hydrokinetic energy from waves or tides for computing power. This could make datacenters work independently of existing energy sources, located closer to coastal cities, powered by renewable ocean energy.
That’s one of the big advantages of the underwater datacenter scheme – reducing latency by closing the distance to populations and thereby speeding data transmission. Half of the world’s population, Cutler says, lives within 120 miles of the sea, which makes it an appealing option.
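To put that distance in perspective, here is a rough propagation-delay estimate. Light in optical fiber travels at roughly two-thirds the speed of light in vacuum; the distances and results below are illustrative, not Project Natick measurements.

```python
# Rough round-trip propagation delay over fiber for a given distance.
SPEED_IN_FIBER_KM_S = 200_000  # ~2/3 of the speed of light in vacuum

def rtt_ms(distance_km):
    return 2 * distance_km / SPEED_IN_FIBER_KM_S * 1000

print(f"{rtt_ms(193):.1f} ms")   # ~120 miles (~193 km): about 1.9 ms
print(f"{rtt_ms(2000):.1f} ms")  # a distant inland datacenter: about 20 ms
```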
This project also shows it’s possible to deploy datacenters faster, turning it from a construction project – which requires permits and other time-consuming steps – into a manufacturing one. Building the vessel that housed the experimental datacenter took only 90 days. While every datacenter on land is different and needs to be tailored to varying environments and terrains, these underwater containers could be mass produced for very similar conditions underwater, which is consistently colder the deeper it is.


Cooling is an important aspect of datacenters, which normally run up substantial costs operating chiller plants and the like to keep the computers inside from overheating. The cold environment of the deep seas automatically makes datacenters less costly and more energy efficient.
Once the vessel was submerged last August, the researchers monitored the container from their offices in Building 99 on Microsoft’s Redmond campus. Using cameras and other sensors, they recorded data like temperature, humidity, the amount of power being used for the system, even the speed of the current.
“The bottom line is that in one day this thing was deployed, hooked up and running. Then everyone is back here, controlling it remotely,” Whitaker says. “A wild ocean adventure turned out to be a regular day at the office.”
A diver would go down once a month to check on the vessel, but otherwise the team was able to stay constantly connected to it remotely – even after they observed a small tsunami wave pass.
The team is still analyzing data from the experiment, but so far, the results are promising.
“This is speculative technology, in the sense that if it turns out to be a good idea, it will instantly change the economics of this business,” says Whitaker. “There are lots of moving parts, lots of planning that goes into this. This is more a tool that we can make available to datacenter partners. In a difficult situation, they could turn to this and use it.”


Christian Belady, general manager for datacenter strategy, planning and development at Microsoft, shares the notion that this kind of project is valuable for the research gained during the experiment. It will yield results, even if underwater datacenters don’t start rolling off assembly lines anytime soon.
“At first I was skeptical, with a lot of questions. What were the costs? How do we power it? How do we connect it? But at the end of the day, I enjoy seeing people push limits,” Belady says. “The reality is that we always need to be pushing limits and try things out. The learnings we get from this are invaluable and will in some way manifest into future designs.”
Belady, who came to Microsoft from HP in 2007, is always focused on driving efficiency in datacenters – it’s a deep passion for him. It takes a couple of years to develop a datacenter, but it’s a business that changes hourly, he says, with demands that change daily.
“You have to predict two years in advance what’s going to happen in the business,” he says.
Belady’s team has succeeded in making datacenters more efficient than they’ve ever been. He helped create an industry metric, power usage effectiveness (PUE), and in that regard, Microsoft is leading the industry. Datacenters are also using next-generation fuel cells – something James helped develop – and wind power projects like Keechi in Texas to improve sustainability through alternative power sources. Datacenters have also evolved to save energy by using outside air instead of refrigeration systems to control temperatures inside. Water consumption has also gone down over the years.
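PUE itself is a simple ratio: the total energy a facility consumes divided by the energy delivered to the IT equipment, so a value of 1.0 would mean zero overhead for cooling, power distribution, and lighting. A quick illustration with made-up numbers:

```python
# Power usage effectiveness (PUE) = total facility energy / IT equipment energy.
# The kWh figures here are invented for illustration.
total_facility_kwh = 1_200_000
it_equipment_kwh = 1_000_000

pue = total_facility_kwh / it_equipment_kwh
print(f"PUE = {pue:.2f}")  # 1.20 -> 20% overhead beyond the IT load
```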
Belady, who says he “loved” this project, says he can see its potential as a solution for latency and quick deployments.
“But what was really interesting to me, what really surprised me, was to see how animal life was starting to inhabit the system,” Belady says. “No one really thought about that.”
Whitaker found it “really edifying” to see the sea life crawling on the vessel, and how quickly it became part of the environment.
“You think it might disrupt the ecosystem, but really, it’s just a tiny drop in an ocean of activity,” he says.
The team is currently planning the project’s next phase, which could include a vessel four times the size of the current container with as much as 20 times the compute power. The team is also evaluating test sites for the vessel, which could be in the water for at least a year, deployed with a renewable ocean energy source.
Meanwhile, the initial vessel is now back on land, sitting in the lot of one of Microsoft’s buildings. But it’s the gift that keeps giving.
“We’re learning how to reconfigure firmware and drivers for disk drives, to get longer life out of them. We’re managing power, learning more about using less. These lessons will translate to better ways to operate our datacenters. Even if we never do this on a bigger scale, we’re learning so many lessons,” says Peter Lee, corporate vice president of Microsoft Research NExT. “One of the things that’s so fun about a CEO like Satya Nadella is that he’s hard-nosed business savvy, customer obsessed, but another half of this brain is a dreamer who loves moonshots. When I see something like Natick, you could say it’s a moonshot, but not one completely divorced from Microsoft’s core business. I’m really tickled by it. It really perfectly fits the left brain/right brain combination we have right now in the company.”

All that RaaS: saving lives and transforming healthcare economics

Stuart, a 66-year-old man with diabetes, felt lousy—constantly fatigued, nauseated, and short of breath after just the slightest exertion. His daughter, worried by his increasing frailty, took him to the emergency room at the local hospital. Her concern was amply justified: Stuart was suffering from heart failure. Like 5.1 million other Americans each year who suffer from heart failure, he was admitted to the hospital to treat this serious, often life-threatening condition. The medical team stabilized his condition, and after 10 days Stuart left the hospital with words of advice and a few medications, glad to be home. Within a month he was back, once again fatigued, and facing a second episode.


Stuart’s story is far from rare. Hospital readmissions for chronic conditions such as diabetes, chronic obstructive pulmonary disease (COPD), and congestive heart failure (CHF) are both common and very costly. Studies conducted in the United States indicate that nearly 20% of Medicare patients who are hospitalized for chronic conditions are readmitted within 30 days. Experts at Edifecs estimate that these readmissions cost Medicare—and US taxpayers—about $26 billion a year, and that a large majority of them are avoidable with accurate prioritization and personalized care protocols. Readmission-related costs have become so onerous that the Affordable Care Act includes financial rewards and penalties to deal with the readmission problem. Hospitals that reduce their readmission rates receive financial incentives; those that fail to do so lose reimbursement and incur penalties.

Holistic tools that can reliably predict heart-failure readmissions—taking into account all aspects of each patient’s condition and risk factors—would significantly help patients and hospitals. The growth in the use of electronic patient records has recently offered the potential for such analysis, but little has been done to harness the collective intelligence contained in hospital patient records augmented with other data sources.
By introducing cloud computing technology and applying some of the latest advances in machine learning techniques, researchers are rapidly changing this situation.
One leading example of this is RaaS (Readmission Score as a Service), a platform that was developed by the University of Washington (UW) Tacoma’s Center for Data Science. RaaS compares a patient’s medical information to a database of heart-failure outcomes, using advanced machine learning techniques to arrive at a risk-of-readmission factor as well as corresponding actionable guidelines for the patient-provider team. Those patients identified with a high risk receive additional treatment: the goal is to reduce their likelihood of readmission and produce overall healthier outcomes across all stages of the patient care continuum.
RaaS’s hundreds of machine learning models are developed using both the R programming language and Microsoft Azure Machine Learning. This chronic care management predictive platform relies on historical patient data from multiple sources, including anonymized electronic medical records, claims, labs, medications, and psycho-social factors, all labeled with observed outcomes. The machine learning models access and share these data in sync to provide continuous monitoring and personalized patient alerts.
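As a rough illustration of what a readmission-risk scorer can look like, here is a minimal scikit-learn sketch trained on invented feature vectors. It is not the RaaS platform itself, which combines hundreds of models built in R and Azure Machine Learning.

```python
# Minimal readmission-risk sketch: logistic regression on synthetic features.
# Illustration only; the feature names and data are invented, not RaaS's models.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: age, prior admissions in past year, length of stay (days), ejection fraction
X = np.array([[66, 2, 10, 30],
              [54, 0,  3, 55],
              [72, 3,  8, 25],
              [60, 1,  5, 45],
              [80, 4, 12, 20],
              [50, 0,  2, 60]])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = readmitted within 30 days

model = LogisticRegression(max_iter=1000).fit(X, y)

new_patient = np.array([[66, 1, 10, 32]])
risk = model.predict_proba(new_patient)[0, 1]
print(f"risk of 30-day readmission: {risk:.0%}")
```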
RaaS is available as an on-premises service as well as via the cloud by using Azure Machine Learning web services and the Azure-based Zementis Adapa scoring engine to make predictions for patients. When deployed using Azure Cloud Services, RaaS performs data preparation at scale.
The UW Center for Data Science team began developing initial models in collaboration with MultiCare Health System in March 2012, using just two on-premises servers. The maintenance, frequent updates, and down times of these on-premises servers posed an ongoing problem, and scalability issues limited the scope of the project by affecting the speed of data exploration and machine learning.
About a year and a half ago, the team applied for and was awarded an Azure for Research grant, taking advantage of the Microsoft Research program that offers training and awards of computing resources to qualified institutions that use the cloud to advance scientific discovery. The award enabled the Center for Data Science team to scale up the project and create a robust prediction engine that generates a readmission risk factor score for patients at every stage of their hospital care: post-admission, pre-discharge, and post-discharge.
The RaaS platform at MultiCare Health enables the care management team to view an electronic dashboard that shows heart-failure patients’ risks of readmission. UW Medicine Cardiology is now collaborating with the Center for Data Science team to study the efficacy of predictive models for augmenting care management guidelines by using machine learning.
—Daron Green, Deputy Managing Director, Microsoft Research
—Gregory Wood, MD, UW Medicine Cardiology

Predicting ocean chemistry using Microsoft Azure


Shellfish farmer Bill Dewey remembers the first year he heard of ocean acidification, a phrase that means a change in chemistry for ocean water. It was around 2008, and Dewey worked for Taylor Shellfish, a company that farms oysters in ocean waters off the coast of Washington. That year, thousands of tiny “seed” oysters died off suddenly. Today, a cloud-based predictive system from the University of Washington (UW) and Microsoft Research may help the shellfish industry survive changing conditions by providing forecasts about ocean water.
Dewey, director of Public Affairs for Taylor Shellfish, vividly remembers walking into a conference room where an audience of shellfish farmers first heard that ocean acidification might threaten their industry profoundly. They learned that increased carbon dioxide in the atmosphere is making ocean water more acidic. In 2013, the Washington legislature stepped in and asked the UW to study and build a predictive forecast model, aptly named LiveOcean.
Just like a numerical weather forecast model, LiveOcean will soon provide a forecast that predicts the acidity of water in a specific bay, part of Puget Sound or other coastal regions, days in advance.
Parker MacCready, a professor of physical oceanography at UW, is the scientist leading the LiveOcean team and used Microsoft Azure to create the cloud-based storage system. The system holds enormous amounts of data from his regional ocean model, the Regional Ocean Modeling System (ROMS), which feeds the LiveOcean forecasts. The Azure component uses Python and the Django web framework to provide these forecasts in an easy-to-consume format. To produce these forecasts, the LiveOcean system relies on other sources: US Geological Survey data (for river flow), atmospheric forecasts, and another ocean model called the HYbrid Coordinate Ocean Model (HYCOM).
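Since the article notes that the Azure front end uses Python and Django to serve forecasts in an easy-to-consume format, here is a minimal sketch of what such an endpoint could look like inside a Django project. The route, field names, and the load_forecast helper are assumptions for illustration, not LiveOcean's actual API.

```python
# views.py -- minimal Django sketch of a bay-level acidity forecast endpoint.
# The route, response fields, and load_forecast() helper are hypothetical.
from django.http import JsonResponse
from django.urls import path

def load_forecast(bay: str):
    # Stand-in for reading the latest ROMS-derived output from cloud storage.
    return [{"hours_ahead": h, "aragonite_saturation": 1.4} for h in (24, 48, 72)]

def bay_forecast(request, bay):
    return JsonResponse({"bay": bay, "forecast": load_forecast(bay)})

urlpatterns = [
    path("forecast/<str:bay>/", bay_forecast),
]
```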

Dewey needs information on acidity levels because a baby oyster must create a shell immediately to survive, and it needs carbonate ions in the water to make that first tiny shell. If the water is too acidic, the baby oyster expends too much energy and dies in its attempt to make that first shell. Taylor Shellfish has hatcheries for the baby oysters and “planting” beds where young oysters are carried to grow to full size. Forecasts of water acidity in both places would help the company know when it is safe to hatch the babies, and where (and when) it is safe to plant them.
Ocean acidification is an emerging global problem, according to the National Oceanic and Atmospheric Administration (NOAA). Scientists are just starting to monitor ocean acidification worldwide, so it is impossible to predict exactly in what ways it will affect the marine environment. In a report, NOAA wrote, “There is an urgent need to strengthen the science as a basis for sound decision making and action.”
Azure tools make the system open to anybody. MacCready is eager to see how others develop sites pulling data on water currents for kayakers, for example, or information for salmon fishers. He is particularly excited about “particle tracking,” which helps him see where individual particles in the ocean move. That tracking could predict where an oil spill might move, for example. Using the cloud is “the way of the future” from his scientific perspective. “It gives the ability to create and use different resources without having to go out and buy hardware yourself.”
Fine-tuning and testing is essential to the reliability of the predictions. In recent years, MacCready and others have been validating the forecasts that LiveOcean is making. They pair real observations from physical instruments to predictions. Within months, he hopes to refine forecasts down to the level of individual bays, so that he can tell Dewey whether Samish Bay or Willapa Bay, for example, is “safe” for the new oysters.
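That validation step comes down to standard forecast-skill arithmetic: pair each prediction with the matching observation and compute error statistics. A tiny sketch with invented numbers:

```python
# Pair predictions with observations and compute simple skill metrics.
# The values are invented; LiveOcean's validation data are not shown here.
import math

observed  = [7.92, 7.88, 7.95, 7.90, 7.85]   # measured pH
predicted = [7.90, 7.91, 7.93, 7.88, 7.87]   # model pH at the same times/places

errors = [p - o for p, o in zip(predicted, observed)]
bias = sum(errors) / len(errors)
rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
print(f"bias = {bias:+.3f}, RMSE = {rmse:.3f}")
```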
LiveOcean has impacts far beyond just the shellfish industry. Jan Newton, principal oceanographer at the Applied Physics Laboratory and co-director of the Washington Ocean Acidification Project (WOAP), believes it may change how the public sees climate change and ocean chemistry.
“Data portals and models like LiveOcean can really make a bridge [of understanding] because even if people don’t understand the chemistry, they’ll look at the color-coding and see how this changes with location and season,” she said. Dewey believes that these tools for the Pacific Ocean chemistry will be adopted by others for oceans worldwide.

Marketing agency improves technology, saves $87,000 with cloud-based telephony

For BDSmktg, its field staff is the core of its business, with only a small percentage of employees at headquarters. BDSmktg is using Skype for Business Online in Microsoft Office 365 to knit these two groups more closely together, accelerate business, and save bundles of money. With Skype for Business Online, BDSmktg will save US$87,000 annually in personal phone charge reimbursements, audio conferencing fees, and PBX maintenance, and avoid the need to spend $250,000 on a new PBX system.

Flustered by phones

James Metcalfe never imagined that the most troublesome technology in his company would be the most mundane: phones.
Metcalfe is Director of IT Network Infrastructure for BDSmktg, an agency that provides retail marketing services for world-class brands by representing their products and services in stores. The Irvine, California-based agency provides thousands of representatives each year to some of the biggest names in retail.
Metcalfe had already outfitted several hundred of the agency’s full-time employees with Microsoft Office 365 to give them anytime, anywhere, any-device access to email, document storage, document sharing, and web conferencing. Employees used the latest PCs, laptops, tablets, and smartphones.
But old-fashioned phone communications posed a growing problem. Only a small percentage of BDSmktg employees work at the Irvine headquarters, while thousands work in the field—from home or on the road—because their jobs require that they be near the stores they service.
A significant portion of the company’s large recruiting team and extensive field staff used their personal phones to conduct business, and BDSmktg reimbursed them for the charges. But this was expensive and problematic. When job candidates returned calls to recruiters, they could end up talking to a recruiter’s family member. Or, if recruiters or field operations managers left BDSmktg and went to work for a competitor, they took job candidates’ phone numbers with them.
“There were delays in tracking down phone numbers to reach colleagues, which slowed down the business,” Metcalfe says.
In the Irvine office, the company’s private branch exchange (PBX) system was old, out of date, and hemorrhaging money. “Every time we had budget talks, the PBX system came up, but sticker shock ended the discussion,” Metcalfe says. “The timing was never right to make the large investment to replace or upgrade it.”

One way to connect everyone

In late 2015, BDSmktg asked to be part of a Microsoft early adopter program for a new version of Skype for Business Online (part of Microsoft Office 365) that included significant telephony enhancements. Cloud PBX and PSTN Calling provide software-based PBX functionality with a bank of Voice over Internet Protocol (VoIP) phone numbers. PSTN Conferencing allows people invited to a Skype for Business Online meeting to join by dialing in over a landline or mobile phone (rather than the Internet).
BDSmktg gave Skype for Business Online to about 300 of its employees, and adoption was instant and enthusiastic. “We’ve been using Lync Online for years, so our staff already had experience with chat, screen share, and video and web conferencing,” Metcalfe says. “Adding PSTN Conferencing and PSTN Calling just makes communications even simpler. With Skype for Business Online, we have one way to connect everyone, wherever they are, whatever device they’re using, and whether they’re connected to the Internet or not.”

More professional, more accountable

Today, BDSmktg employees who work from home have an assigned Skype for Business Online phone number that they use for work calls; no more giving out personal phone numbers. When an employee leaves BDSmktg, there’s no longer the worry that a personal phone number is a contact’s only link to the company. BDSmktg simply reassigns the Skype for Business Online phone number to a new employee, maintaining continuity with client and job candidate communications.
“With PSTN Calling, we can track every inbound and outbound call, see the number called, and the duration of the call,” Metcalfe says. “We have much better accountability around a critical part of our business.”

Work effectively from anywhere

Employees working from home now feel better connected to the company because they can connect quickly with colleagues. “We’re able to provide more seamless communication for our employees who work from home,” Metcalfe says. “People are blown away by the quality of the HD Voice in Skype for Business Online. They don’t want to go back to regular phones.”
BDSmktg management likes the flexibility that the new features provide. “With Skype for Business Online, we have more freedom to place people wherever the business needs them to be, rather than having technology limitations determine employee access,” says Ken Kress, President of BDSmktg.

Huge savings

Management also likes the savings. By using Skype for Business Online for field staff telephony, BDSmktg eliminates the need to reimburse employees for calls made from personal devices—a US$12,000 annual savings.
By replacing the $8,000-a-month licenses from its current conferencing provider with a $1,700-a-month Skype for Business Online subscription, BDSmktg will save $75,000 annually.
And by replacing its physical PBX with Cloud PBX, BDSmktg will avoid a $250,000 replacement cost and ongoing maintenance costs of $35,000 a year.
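A quick check of how those line items roll up, using only the figures quoted in this story:

```python
# Roll-up of the savings figures quoted above.
personal_phone_reimbursements = 12_000           # per year
conferencing_before = 8_000 * 12                 # $96,000/yr with the old provider
conferencing_after = 1_700 * 12                  # $20,400/yr with Skype for Business Online
conferencing_savings = conferencing_before - conferencing_after  # ~$75,000/yr

annual_savings = personal_phone_reimbursements + conferencing_savings
print(f"~${annual_savings:,.0f} per year")       # ~$87,600/yr, quoted as $87,000

avoided_pbx_capex = 250_000                      # one-time replacement avoided
avoided_pbx_maintenance = 35_000                 # per year, on top of the above
```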
Last but not least is the real estate cost avoidance that BDSmktg could realize by using Skype for Business Online. “We’ll avoid significant costs to expand our office as our company grows as we enable more people and roles to work from home,” Metcalfe says.

Easy to manage

From Metcalfe’s perspective, having telephony functionality bundled with Office 365 makes his life easier. He eliminates the work and expense of a physical phone infrastructure. It’s far easier to move employees around the office and to move them from office to home. “Scaling up and creating additional phone numbers with PSTN Calling is very straightforward,” Metcalfe says.
There are fewer vendors and bills to manage. More services on user desktops are connected and interoperable, making support easier. “Giving employees new capabilities and saving money is what a successful IT department strives for,” Metcalfe says. “I’ve been championing a new phone system for three years, and to finally find a solution that is affordable, easy to implement, and easy to use is a game changer.”

Next, extend to every field employee

Metcalfe’s vision is for all the company’s thousands of field staff representatives to have access to Skype for Business Online and other Office 365 services. The above-mentioned savings could well make this possible.
“It would be ideal for our field operations managers to easily and instantly connect with the representatives that they manage,” Metcalfe says. “Everyone would have the Skype for Business Online mobile app on their smartphones. As our field programs ramp up and down, we adjust our Office 365 subscriptions as required using a central admin portal. It would make us more nimble, more responsive, and more competitive than ever.”

Contact us today!

Chat with an expert about your business’s technology needs.