Indian companies such as Flipkart (the country’s leading marketplace) and Tata Motors, along with Korean companies such as LG Electronics, Samsung Electronics, and Asan Medical Center, are driving digital transformation by leveraging the Microsoft cloud.

Azure is available from two new cloud regions in Korea, part of 38 Azure regions around the globe, more than any other cloud provider offers. Thirteen of those regions are in Asia, giving customers across the region the opportunity to leverage the full capabilities of the Azure cloud platform.

Across industries, including finance and health care, Korean companies are putting local Azure services to use in their organizations. LG Electronics is streaming real-time data and using the scalability of virtual machines to better serve its customers, while Asan Medical Center collaborates with industry and academia to supply anonymized clinical notes through Microsoft's hybrid cloud.

Other Korean companies, such as Samsung Electronics, are leveraging the Internet of Things (IoT). Using its remote energy-reduction solution, S-Net Cloud, Samsung monitors energy use and delivers efficiencies that save customers money on energy costs.

The data center regions that opened in India in September 2015 have brought the immense computing power of the Microsoft cloud to businesses seeking growth and innovation. Flipkart has adopted Azure as its exclusive public cloud platform to do just that: grow and innovate. The collaboration between Microsoft and Flipkart marks progress toward providing customers with the best online shopping experience possible, aided by Flipkart's use of Azure Artificial Intelligence (AI) and analytics to optimize merchandising, marketing, and customer service. This partnership comes on the heels of Microsoft's cloud collaboration with Tata Motors – India’s leading auto manufacturer – to provide connected driving experiences with Azure.

People and organizations across the globe are embracing the cloud to solve unique challenges. You can follow these links to learn more about the new regions in Korea, how Flipkart is using Azure, case studies of how people and organizations are using the cloud for innovation, and to learn more about Azure services and solutions.

5 steps to a solid disaster recovery plan

If your business was about to be destroyed by fire, and you had one minute to save one file, what would it be?
I’d guess not your pictures of Fluffy the cat. But maybe your payroll data or customer order list. A Disaster Recovery strategy defines which data you will save first and what will be available during planned or unplanned downtimes. It also plans for the data you can live without. Poor Fluffy.

Complexity vs. Costs

When you create your Disaster Recovery plan, you’ll need to weigh the trade-offs between complexity and cost. What data can you afford to be without? For how long? If you lost some data, would that destroy your business forever?

Five parts of a disaster recovery plan:

1. Recovery Point Objective (RPO). RPO defines how much data you are willing to lose. You can give higher priority to your most critical data, but be willing to lose less important data, such as pictures of Fluffy. Customer records might be top of your list, while marketing data might rank lower.
2. Recovery Time Objective (RTO). RTO weighs how long you are willing to be without your data. Depending on your business, you might decide that you can lose up to two hours of business operation. A shorter time will create higher costs, so you’ll need to consider your options carefully.
3. Personnel. Who should get their data back sooner? Who will support the plan? Do you have a backup person as well as backup technology? Is your plan dependent on human intervention, which may not be possible in all cases?
4. Regulatory constraints. Is your business subject to regulatory compliance? How will you make sure you are covered?
5. Critical data. Which data is critical to your business? What are the dependencies between different areas of the business?
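The five questions above can be sketched as a tiny data model that you can rank on. The dataset names, numbers, and prioritization rule below are illustrative assumptions, not prescriptions:

```python
from dataclasses import dataclass

@dataclass
class DataSet:
    name: str
    rpo_minutes: int          # step 1: how much data loss is acceptable
    rto_minutes: int          # step 2: how long you can be without it
    owner: str = "it-team"    # step 3: who restores it
    regulated: bool = False   # step 4: subject to compliance rules
    # step 5: criticality falls out of the sort below

def recovery_order(datasets):
    """Restore regulated data first, then whatever has the tightest RTO."""
    return sorted(datasets, key=lambda d: (not d.regulated, d.rto_minutes))

inventory = [
    DataSet("fluffy-photos", rpo_minutes=10080, rto_minutes=10080),
    DataSet("customer-orders", rpo_minutes=15, rto_minutes=120),
    DataSet("payroll", rpo_minutes=60, rto_minutes=240, regulated=True),
]
```

Sorting by (regulated, RTO) is just one possible policy; the point is that the five questions become fields you can reason about and rank.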

Test and train

Often companies will create a plan, and then leave it on the shelf. They don’t fully test the plan, or consider multiple scenarios. When a disaster hits, whether it’s cybercrime or a hurricane or a rogue sprinkler system, the plan fails. The New York Stock Exchange had a plan before Hurricane Sandy, but they didn’t follow it when disaster hit. Instead, they closed the stock exchange for two days.
Your resources and business needs will change over time. This includes your location, personnel, and data. Testing your plan two to three times a year is one way to make sure the plan is up-to-date and still supports your current business goals.
Once you have a plan in place you’ll need to train all personnel. For a higher chance of success, ensure that senior management endorses the plan and promotes training for all employees.

Get help to create a plan

A cloud solution can help you find a good balance between cost and complexity. With Azure Site Recovery, you can easily create disaster recovery plans in the Microsoft Azure portal. The disaster recovery plans can be as simple or as advanced as your business requirements demand.
We’re here to help you with all stages of strategy, planning and implementation.


Did you take a wrong turn on the road to disaster?

Can you find your way back if you take a wrong turn on the road to disaster? If disaster strikes, recovering from it could cost you thousands of dollars and lose you your customers’ good faith.
Having a plan to save your business from bad things will help you get back on track.
Most businesses rely on technology to run their day-to-day processes. Think email, Word docs, customer information and ordering systems, inventory, accounting. Other companies are all about the tech. Think Uber, Airbnb, Constant Contact and many others.
So, if something disastrous happens, your business could grind to a halt, whether tech is your main business or ‘just’ how you get your work done every day. What could go wrong? Well, there are floods. Plus electrical storms, hurricanes, fire, and people who leave with your passwords or source code. Without a backup plan, your business could be in trouble.

What’s included in a plan?

Disaster plans can cover everything from how to get out of your store or office during a fire drill, to how to get back up and running if your servers are underwater.
Then, you can take it a step further. If you think about staying in business during a disaster, as well as recovery, you’ll be in the best shape you could be.
While it’s important to customize a plan for your business, every plan should include:
* Technology asset inventory that names mission critical processes and data
* Schedule for updating and testing any disaster recovery plans
* Clear understanding of the trade-offs between cost and complexity

Murphy’s Law

Murphy’s Law says that whatever can go wrong, will go wrong. That’s why it’s important to understand how your plan works. If you are a business decision maker, you might hand this over to your IT team. But it’s important to ask some questions to make sure you have full coverage for your business. A few questions:
* Does your plan include an inventory of mission critical business processes and data?
* When was the last time anyone reviewed your plan? Tested your plan?
* Is cyberattack preparedness included in your current plan?
* How much depends upon human intervention?

Evaluate Cloud Solutions

During a disaster, humans have other priorities than failing over their virtual machines. Automating your solution is key to ensuring success.
A cloud solution can help you recover quickly. And it’s less expensive than having your own datacenter to support and protect. It’s a practical solution for a business, whether it’s large or small. But it makes especially good financial sense for a smaller organization.
Not all cloud providers are equal, so you’ll need to do some research to compare. A few considerations:
* Do they offer a hybrid solution, so that you can keep some data on-premises as well as in the cloud?
* Do they offer metered service so that you can save even more money by ‘turning off’ services when you don’t need them?
* Is the service easy to use, with good support for your team?
* Do they offer geo-redundancy?
* Are they compliant with your industry?

Is the cloud safe?

But wait a minute, you might say. I’ve read about companies, even big companies, losing data in the cloud during a disaster.
It’s true, there have been times when the cloud failed companies. When this happened, it was because the data was only stored in one location. And it was based in the same region as the company. That’s an obvious mistake.
With the Microsoft cloud, you can get ‘geo-redundancy.’ This means that your data is in more than one location. So, if your area is hit with a hurricane, along with floods and electrical storms, your data would be safe in a datacenter across the country.
That also means that your company data is available even during the storm.
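Under the hood, geo-redundancy maps each primary region to a distant secondary where a second copy of your data lives. The pairings below are a small, illustrative subset of Azure's published region pairs:

```python
# A few of Azure's published region pairs (illustrative subset).
REGION_PAIRS = {
    "eastus": "westus",
    "northeurope": "westeurope",
    "southeastasia": "eastasia",
    "koreacentral": "koreasouth",
    "centralindia": "southindia",
}

def replica_region(primary: str) -> str:
    """Return the paired region where the geo-redundant copy lives."""
    try:
        return REGION_PAIRS[primary]
    except KeyError:
        raise ValueError(f"no pair on file for {primary!r}")
```

Because the pair is always hundreds of miles away, a hurricane that floods the primary region leaves the secondary copy untouched.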

Steps to take

1. We’d love to meet you and discuss your plans for keeping your business running. If you’re ready right now, call us or send an email. We’re happy to set up a free consultation to review your plan.
call: 858-429-3000


Subscribe to our industry newsletter for more insights on how to help your business.



Aluvii stands apart from the competition by offering cutting edge data visualization with Power BI Embedded

By The Power BI Team as written on Microsoft Corporation
The Power BI Team is excited to launch a new blog series highlighting notable ISVs who have integrated Power BI Embedded into their offerings to differentiate and innovate their solutions. Our first post comes from Aluvii, which offers a SaaS POS solution for amusement parks and leisure facilities. With Power BI Embedded, Aluvii aims to stand apart from the competition, bringing rich data visualization to life inside its application offerings. How did they integrate Power BI Embedded into their product? Read more below, and stay tuned for additional posts coming your way.
Aluvii, an all-in-one POS software platform for the amusement and leisure industries, recently released their flagship SaaS product and are experiencing a very positive response in the marketplace. The cloud-based software includes a comprehensive set of modules needed to efficiently run an entire business including ticketing, point of sale, e-commerce, memberships, events & reservations, inventory, HR, scheduling & timekeeping, sales & marketing, member portal, online waivers, and much more. Because Aluvii is cloud based, it’s accessible anytime, anywhere, and on any device. In addition, all registers, customer data, and reports are always available, safe, and up to date.
We at Aluvii recognize automated business intelligence and reporting are critical to customers. As such, we selected and utilize Microsoft Power BI Embedded as our cutting edge reporting and dashboard technology solution.
The primary reason for our selection of Microsoft Power BI Embedded was the need for a single reporting platform that could be deployed across multiple operating systems, web portals, and applications. Microsoft’s Power BI is an excellent platform for deploying dashboards and reports, but it lacked the flexibility of embedded application inclusion. With the release of the Power BI Embedded solution, we were able to carry everything we loved about the standard Power BI framework into an integrated product. We researched other solutions, but none truly allowed for embedded application deployment, and many required convoluted integration architectures with a greater possibility of breakage. Since Aluvii is built on top of Azure, the Power BI Embedded solution was seamless and drove down the total cost of ownership.
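At the time, Power BI Embedded authenticated an embedded report with an "app token": a JWT signed (HS256) with the workspace collection's access key. The claim names below reflect our reading of that era's token format and should be treated as assumptions; a minimal stdlib-only sketch:

```python
import base64
import hashlib
import hmac
import json
import time

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWT requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_app_token(access_key, workspace_collection, workspace_id, report_id, ttl=3600):
    """Build an HS256 app token for a Power BI Embedded report (assumed claim names)."""
    header = {"alg": "HS256", "typ": "JWT"}
    claims = {
        "typ": "embed",                                   # token type
        "ver": "0.2.0",                                   # app-token version
        "aud": "https://analysis.windows.net/powerbi/api",
        "wcn": workspace_collection,                      # workspace collection name
        "wid": workspace_id,
        "rid": report_id,
        "exp": int(time.time()) + ttl,
    }
    signing_input = ".".join(
        b64url(json.dumps(part, separators=(",", ":")).encode())
        for part in (header, claims)
    )
    sig = hmac.new(access_key.encode(), signing_input.encode(), hashlib.sha256).digest()
    return f"{signing_input}.{b64url(sig)}"
```

In production you would use the official SDK rather than hand-rolling tokens; the sketch just shows the shape of what the SDK produces.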
The same architecture that allows for rapid deployment of reporting is also managed through Azure API Management. This allows our more advanced clients to build their own custom Power BI Desktop reports and dashboards. With the API gateway created by Azure API Management, Aluvii is able to set up subscription models that provide access to data and other features. Billing clients is also streamlined through this process.
Our development team has leveraged Microsoft Azure as its PaaS and as the company’s core API architecture, which has allowed for increased flexibility in development and deployment. And by kick-starting the business through the Microsoft BizSpark program, we were able to test and utilize the best of Microsoft’s product line, ensuring a state-of-the-art solution at affordable prices.
The Azure SQL Database product, which is used as the backbone of the Aluvii Software Suite, allows for quick integration with Power BI Embedded. The write-once-deploy-everywhere software process ensures that our reports can be integrated within our web portals, desktop applications, and mobile platforms with minimal development and cost. Microsoft Power BI’s flexible visualizations give our Aluvii product the ability to develop truly unique reporting solutions across all of our various feature sets, using multiple data sources. At this time, Aluvii is unique in providing dynamic, multi-tenant reporting services built into the core application among all of its other software features.
Overall, Aluvii credits its growing success to its diverse product offerings, the scalability of the Microsoft Azure cloud platform, and its innovative development teams. Since the initial release of its flagship SaaS offering, Aluvii has received tremendous interest from customers across the globe. The Power BI Embedded reporting feature allows Aluvii to differentiate itself from its competitors, granting deep insights into its customers’ operations.


The SwimTrain exergame makes swim workouts fun again

By Miran Lee as written on

To many who swim for exercise, workouts come down to the monotony of doing laps—swimming back and forth in a pool. Over and over. Unlike other exercisers, who can make their routines less of a chore by adding a social component—working out with friends, family, or in groups—swimmers really haven’t had many options, because coordinating a group of swimmers is difficult. The Korea Advanced Institute of Science and Technology (KAIST) and Microsoft Research Asia (MSRA) are happy to report that with SwimTrain, their new cooperative “exergame” research project, you’ll never have to swim alone again.
SwimTrain is the result of a research collaboration between KAIST and MSRA. The project targets something we can all relate to: exercise boredom. Swimming, while one of the best ways to get fit, can be tedious. The SwimTrain team thinks they have a way to make swimming a lot more exciting.


How does SwimTrain work? First, you slip your phone into a waterproof case and plug in some waterproof headphones. Then, you jump in. Players get matched up as a team to form a virtual “train,” with each player controlling the speed of a single train compartment. Go too fast or too slow, and the game warns you of bumping into other compartments. Featuring narration, vibration feedback, spatialized sound effects, and background music, the immersive experience takes players through different modes of gameplay based on an interval training workout plan.
Each SwimTrain round consists of three phases:
Phase 1: Compartment ordering
Compartments race against other compartments. A compartment is ranked based on a swimmer’s average stroke speed during the race.
Phase 2: Train running
Compartments are placed along the same track and run in a circle (like a merry-go-round). To earn points, each swimmer must match their current stroke rate to the target stroke rate established in the previous phase. A compartment shifts forward or backward as the swimmer’s current stroke rate rises above or falls below the target rate, and it must travel without crashing into adjacent compartments.
Phase 3: Train stop
The virtual train stops. Every swimmer takes a short rest. The game narrates the final ranking of the current round and information for the next round, such as the duration of each phase and recommended stroke types.
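The Phase 2 mechanics can be sketched in a few lines. The proportional shift rule and the crash threshold here are our illustrative assumptions, not the team's published implementation:

```python
def compartment_shift(current_rate: float, target_rate: float, gain: float = 0.5) -> float:
    """Positive when swimming faster than the target: the compartment creeps forward."""
    return gain * (current_rate - target_rate)

def update_positions(positions, rates, targets, spacing=1.0):
    """Advance each compartment; report indices that bump the compartment ahead."""
    new = [p + compartment_shift(r, t) for p, r, t in zip(positions, rates, targets)]
    bumps = [i for i in range(1, len(new)) if new[i - 1] - new[i] < spacing]
    return new, bumps
```

A swimmer who holds the target rate keeps their gap; one who surges closes on the compartment ahead and triggers the bump warning.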
SwimTrain accomplishes immersive gameplay by relying on advanced tech packed into a mobile phone. The barometer, accelerometer, gyroscope, and magnetometer track swimming activities, determining swimming periods, stroke, style, speed, and other events. This information is fed to a Network Manager based on the Microsoft Azure cloud, and is then delivered back to the game as rank and round data, determining the status of the player in relation to the train. It’s also passed to a Feedback Manager, which provides the auditory and sensory feedback that make SwimTrain unique.
Preliminary feedback from users is positive: SwimTrain makes you feel like you’re not alone in the pool. According to one test user, “Although [SwimTrain] didn’t provide any visual feedback, I felt like I was swimming with others.” Feedback also indicates that SwimTrain provides an immersive and enjoyable experience that’s an intense workout, too.

The project team’s research is getting noticed in the world of human-computer interaction (HCI). CHI 2016, the world’s top conference for HCI, has accepted the team’s research for inclusion in the CHI 2016 Notes and Papers Program.
This collaboration with KAIST is a great example of how Microsoft values symbiotic relationships with partners in academia. “Not only do we have the ability to shape the future of Microsoft products, we have the chance to support and learn from some of the top professors in computer science,” said Darren Edge, lead researcher at MSRA. Many of these collaborations lead to internships. “When a student makes a particularly promising contribution to a joint project, we can also invite them to spend time at Microsoft as a research intern. Everybody wins from such internships: we get some of the brightest PhD students to work on our projects, and the students develop new expertise and skills that they can apply to their university work with their professor.”
Darren explains that this recently happened as a result of his ongoing collaboration with Professor Uichin Lee at KAIST. Following the completion of work on SwimTrain, Professor Lee’s PhD student Jeungmin Oh joined Darren at MSRA for a six-month internship, working in another area. “We are all now collaborating on multiple projects in parallel. If any of them are as successful as SwimTrain, which won the third place award at the recent Microsoft Korea and Japan Day and has two accepted papers pending publication, I will be very happy indeed,” he states.
The MSRA HCI group has in fact had a longstanding collaboration with academia: In recent years, MSRA has supported principal investigators for projects published at CHI 2014, CSCW 2015, and CHI 2016.
In the future, SwimTrain will focus on measuring more data, such as heart rate and maximal oxygen uptake, to determine the exertion level of a player’s swimming. Also, the method might be applied to other group exercises, such as group jogging and group cycling. We look forward with anticipation to what SwimTrain might inspire.



Aiming to Deliver New Drugs Faster at Less Cost in the Cloud


Researchers from Molplex, a small drug discovery company; Newcastle University; and Microsoft Research Connections are working together to help scientists around the world deliver new medicines more quickly and at lower cost. This partnership has helped Molplex develop Clouds Against Disease, an offering of high-quality drug discovery services based on a new molecular discovery platform that draws its power from cloud computing with Windows Azure.
Rethinking Drug Discovery
David Leahy, co-founder and chief executive officer of Molplex, envisions a way to help pharmaceutical researchers anywhere in the world form effective drug discovery teams without large investments in technology or fixed running costs. "It takes massive computing resources to search through chemical and biological databases looking for new drug candidates. Our Clouds Against Disease solution dramatically reduces the time and cost of doing that by providing computation and chemical analysis services on demand," Leahy says.
Molplex regards drug discovery as a big data and search optimization problem. Clouds Against Disease uses its computational molecular discovery platform to automate decision making that is traditionally the scientists’ task.
"Instead of having teams of scientists scanning chemical information, our software searches for structures that have multiple properties matching the search criteria," explains Leahy. "When we integrate that with highly automated chemical synthesis and screening, it becomes a much more efficient and productive way of doing drug discovery."
Data Manipulation on a Larger Scale
In a recent pre-clinical study, the company applied its computational platform to more than 10,000 chemical structure and biological activity data sets. This generated 750,000 predictive relationships between chemical structure and biological effect. Molplex then used the same validation criteria that scientists would use to narrow the 750,000 relationships down to just 23,000 models covering 1,000 biological and physico-chemical properties, a relatively small data set that humans could then manage. "It would have taken hundreds of scientists several years to do this the conventional way," Leahy says.
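The winnowing Leahy describes, from 750,000 candidate relationships down to 23,000 validated models, is at heart a filter over model-quality metrics. A toy sketch, with validation thresholds that are purely assumed:

```python
def validate_models(models, min_r2=0.7, min_samples=50):
    """Keep only models that meet the (assumed) validation thresholds."""
    return [m for m in models if m["r2"] >= min_r2 and m["n"] >= min_samples]

candidates = [
    {"property": "solubility", "r2": 0.91, "n": 420},
    {"property": "toxicity",   "r2": 0.55, "n": 300},   # fails the fit threshold
    {"property": "logP",       "r2": 0.83, "n": 12},    # too few data points
]
kept = validate_models(candidates)
```

Run across 750,000 candidates on a hundred cloud nodes, the same filter is what turns an unmanageable output into a data set humans can review.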
Windows Azure was critical to the success of Clouds Against Disease. Molplex can access 100 or more Windows Azure nodes—in effect, virtual servers—to process data rapidly. The physical-world alternative would be to source, purchase, provision, and then manage 100 physical servers, which represents a significant investment in up-front costs. Before they could begin drug research, scientists taking this traditional approach would have to raise millions of dollars, but Windows Azure helps eliminate start-up costs by allowing new companies to pay for only what they use in computing resources.
Vladimir J. Sykora, co-founder and chief operating officer for Molplex, explains that the Molplex computational platform runs algorithms his company developed to calculate the numerical properties of molecules rapidly. Consequently, Molplex has been able to produce drug discovery results on a much larger scale than what was previously feasible. "We would not have been able to predict so many compounds without the cloud computing resources enabled by Windows Azure," asserts Sykora. "The speed and high level of detail provided by Windows Azure allow us to explore far beyond what would have been possible with traditional hardware resources."
Fighting Tropical Diseases
Molplex is embarking on a new collaboration with the Malaysian government to search for drugs that fight tropical diseases. This search has always been a lower priority for drug companies because the market is smaller, making it a less desirable commercial prospect. The traditional drug discovery program is geared to $1 billion a year blockbuster drugs; however, there are fewer opportunities today for drugs with that level of commercial potential.
Increasingly, scientists are researching tropical diseases that affect smaller populations; radically reducing the cost of drug discovery makes it feasible for scientists to tackle them. "Unlocking drug discovery technology from a physical location with the cloud has tremendous potential to help researchers work on curing these diseases faster and at less cost," asserts Leahy, "wherever they are in the world."


Microsoft Makes It Easier To Move SQL Server Licenses to Azure


By Kurt Mackie as written on

Organizations with Enterprise Agreements (EAs) can now move their existing SQL Server licenses to Azure Virtual Machines using prebuilt images, Microsoft announced late last week.
Microsoft currently sells SQL Server use on Azure Virtual Machines on a "pay per use" licensing basis, but organizations can now bring their own license and tap Azure infrastructure if their SQL Server licenses are covered under an EA. To do that, an organization just selects a prebuilt image from the Azure gallery and then gets charged for the Azure compute costs, per Microsoft's announcement:

Starting this week, customers with Enterprise Agreement who already have SQL Server Licenses, can use them on Azure Virtual Machines with Microsoft-certified (BYOL) gallery images. These images will not charge for SQL Server licensing, just for compute cost.

The image selection process is demonstrated briefly in this Microsoft video. Organizations "don't get billed per hour" for SQL Server licensing, per that video, although they are billed for the compute-time component. Microsoft offers documentation on the Azure Virtual Machine provisioning process at this page.
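The billing difference is easy to see with back-of-the-envelope arithmetic. The hourly rates below are made-up placeholders, not Azure prices:

```python
HOURS_PER_MONTH = 730  # average hours in a month

def monthly_cost(compute_rate, sql_rate=0.0, hours=HOURS_PER_MONTH):
    """Total = compute plus any per-hour SQL Server licensing surcharge."""
    return hours * (compute_rate + sql_rate)

pay_per_use = monthly_cost(compute_rate=0.50, sql_rate=1.00)  # SQL billed hourly
byol        = monthly_cost(compute_rate=0.50)                 # EA BYOL: compute only
savings = pay_per_use - byol
```

With a license already covered under an EA, the entire per-hour SQL surcharge drops out and only the compute line remains.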
The deal isn't for small organizations. Currently, an EA is a licensing program for organizations with 250 or more users or devices, but that minimum will be raised to 500 users or devices for new contracts starting on July 1, 2016.
Microsoft also has a so-called Azure "license mobility" program that applies to organizations with server licensing covered under Software Assurance (SA) agreements. An SA is an extra-cost annuity agreement on top of a software license that permits software upgrades within the SA contract period. Organizations with SA agreements also can tap Azure infrastructure using their existing server licenses, although they have to bring their own images under that plan, explained Wes Miller, an analyst with independent consulting company Directions on Microsoft, based in Kirkland, Wash.
"They (SA customers) do have that option," Miller said, via an e-mail. "But note that to do so they have to bring their own images. This option also lets them start with images from the gallery and not get charged for SQL, just compute."
In response to a question, Miller said Microsoft's new EA deal isn't bearing any extra costs other than the Azure compute costs. There aren't any Client Access License charges as "Azure generally doesn't have the concept of CALs," he noted.


The next phase of Microsoft Academic: intelligent bots at your service!

By Kuansan Wang as written on


Progress in AI research and applications is exploding, and that explosion extends to our own team working on academic services. Continuing our work supercharging Bing and Cortana, we are also applying new technologies to Microsoft Academic, which serves the research community. If you’re not familiar with Microsoft Academic, this online destination helps researchers connect with the papers, conferences, people, and ideas that are most relevant, using bots that read, understand, and deliver the scientific news and papers researchers need to further their work.
Designed by and for researchers like myself, the site puts the broadest and deepest set of scientific information at your fingertips, with the ability to go beyond keywords to the contextual meaning of the content. Recently, we further enhanced the analytic content so users can see the latest research, news, and people, ranked by importance and credibility. Users can even drill down on the people, events, and institutions they care most about.
Behind the scenes, we are taking advantage of the fact that machines do not require time to sleep or eat, and have superior memory to humans. We have trained our AI robots to read, classify, and tag every document published to the web in real time. The result is a massive collection of academic knowledge we call the Microsoft Academic Graph (MAG), which is growing at roughly 1 million articles per week. While one set of robots is busy gathering knowledge from the web, another set of robots is dedicated to analyzing citation behaviors and computing the relative importance of each node in the MAG so that users are always presented with information they need and want.
Microsoft Academic is based on the work our team developed for Microsoft Cognitive Services, including open APIs that give developers AI-based semantic search tools and entity-linking capabilities. We’re also applying AI semantic search—which is contextual and conversational—to Cortana, Bing, and more.
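As an illustration, the Academic Knowledge API in Cognitive Services exposed an `evaluate` endpoint that accepts a structured query expression. The endpoint URL and expression syntax below are based on that API's public documentation; treat the specifics as assumptions if you adapt this sketch:

```python
from urllib.parse import urlencode

EVALUATE = "https://api.labs.cognitive.microsoft.com/academic/v1.0/evaluate"

def author_since(author: str, year: int) -> str:
    """Build a query expression: papers by `author` published after `year`."""
    return f"And(Composite(AA.AuN=='{author}'), Y>{year})"

def evaluate_url(expr, attributes="Ti,Y,CC", count=10):
    """Assemble the GET URL; a subscription key goes in the request header."""
    return EVALUATE + "?" + urlencode(
        {"expr": expr, "attributes": attributes, "count": count}
    )

url = evaluate_url(author_since("jaime teevan", 2013))
```

The same expression language powers interactive queries on the Microsoft Academic site, so experiments built against the API carry over directly.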
As a research organization, we understand the pivotal role that open communication plays in advancing science. As such, we’re making the back-end dataset and algorithms available to all through Cognitive Services. There, everyone can access and conduct research on the massive and growing dataset through the cloud-based APIs. This means you don’t have to worry about the logistics of transmitting the massive dataset over the Internet, or manage a cluster of computers just to host and analyze the data. We are particularly excited that the research community has taken advantage of these cloud resources and already is collaborating on a common data and benchmarks platform to advance the state of the art. Earlier this year, we saw 81 teams participate in the WSDM Cup 2016 to develop new methods to rank papers, including newly published ones that have yet to receive any citations. An ongoing challenge is the KDD Cup 2016, which is focused on finding a better way to rank the importance of research institutions. The results of the first two stages of the contest have already been published, and I cannot wait to see the final outcomes and learn what new insights and technologies the 500 participating teams have developed when results are announced in August at KDD 2016 in San Francisco!
I encourage you to start experiencing the breadth and depth of what Microsoft Academic currently has to offer and to continue this journey with us in our mission to empower every academic and every academic institution on the planet to achieve more.


Contact us Today!

Chat with an expert about your business’s technology needs.