[vc_row][vc_column][vc_column_text]

Solving global water challenges with Microsoft cloud technologies

We’ve been warned for decades to improve water management, and recent events such as the California drought are currently keeping the issue in the news. We take action at home by not watering our lawns and spending less time in the shower. But although personal conservation is a step forward, it’s only part of the solution. The morning cup of coffee after your three-minute shower took 55 gallons of water to produce. The shirt you’re wearing commanded another 700 gallons of water to create, and the car you drove to work required a staggering 39,090 gallons to build. Fortunately, Ecolab—a leading global provider of water, hygiene, and energy technologies and services—is on a mission to make changes that enable the goods and services we rely on every day to be produced and delivered with a lot less water.

Microsoft Azure and IoT services have helped us get much closer to our ambition to help customers operate at water-neutral. To get there, we need to collect massive amounts of information. Now, we can identify the opportunities and gaps, provide the right solution, and most importantly, manage our customers’ processes so that they can get closer to net-zero water usage.

Christophe Beck, Executive Vice President and President, Nalco Water, an Ecolab company


Addressing the world’s water challenges

The United Nations’ 2015 World Water Development Report predicts that by 2030 demand will outpace supply by almost 40 percent and two-thirds of the world population could be under stress from lack of fresh water. Under pressure to operate more sustainably, governments and industries worldwide are turning to Ecolab for help. The company protects vital resources by helping companies achieve net-zero water usage—producing goods with infinitely recycled water—and providing access to fresh water for more people.
With more than 47,000 employees working across 170 countries and multiple industries, Ecolab needed better solutions to collect data from more than 36,000 water systems in customer operations to provide deeper insight and deliver more value. “Our goal of helping customers reduce water usage requires that we capture real-time information from processes anywhere around the world,” says Christophe Beck, President, Nalco Water, an Ecolab company. “We need to be able to control processes remotely and deliver the intelligence that enables our service personnel in the field to manage those processes for optimal performance results.”

Improving insights and management through the cloud

Ecolab chose the Microsoft cloud to enhance its ability to deliver personalized services and drive innovation. The company is taking full advantage of the Microsoft Azure platform, including the Azure IoT Suite, to accelerate water scarcity solutions for global industries. The company’s cloud solution also includes Microsoft Dynamics 365, Microsoft Power BI, and Microsoft Office 365, providing a seamless way to collect, analyze, and share information across multiple locations worldwide.
Ecolab and Microsoft worked closely together to deliver water management solutions on a much larger scale, and at a much deeper level, than previously possible. “We’re touching most of the vital operations within a plant from pretreatment to production processes through waste water,” Beck says. “And then you have all of the steps of cooling and boiling water. Connecting and controlling all those operations within a plant to reduce water as a whole is the real challenge. That’s why we needed to work closely with Microsoft to leverage the right portfolio of technology solutions.”
The company wanted to drive change from two directions: gathering and analyzing data to improve production processes, and then using the aggregated data to demonstrate to customers that its solutions make sound business sense.
In a recent scenario, a company with 262 plants worldwide and more than 250,000 employees wanted to become water-neutral—or in other words, it wanted to recycle all of the water coming out of its plants back into the production process. But to make a new conservation plan feasible, Ecolab needs to show potential customers that the investment will be financially as well as environmentally sustainable, with a 100 percent return on investment.
It all starts with data. Inside production facilities, equipment based on Ecolab’s Nalco 3D TRASAR™ Technology sends data to a highly secure analytics and storage platform on Azure. The technology monitors and controls streams for water-intensive processes and collects and analyzes the data in real time. “Microsoft Azure and Azure IoT Suite have helped us get much closer to our ambition to help customers operate at water-neutral,” says Beck. “To get there, we need to collect and analyze massive amounts of information. Now, we can identify opportunities and gaps, provide the right solutions, and most importantly, manage our customers’ processes so that they can get closer to net-zero water usage.”
The information Ecolab collects from 36,000 water systems in more than 100 countries, spread across five continents, provides actionable intelligence that can be used to benchmark performance and drive continuous improvement. Shared through the cloud, the data is accessible to Ecolab service personnel through mobile devices such as Microsoft Surface Pro.
Ecolab associates use the information to engage customers more proactively and make informed recommendations to improve processes. The company’s field personnel also can use the Microsoft solution to effectively quantify and communicate the return on investment a customer achieved through its water management program—and recommend areas where additional investment in services might drive even greater reduction in water, energy, and operational costs. “It’s really a ‘virtuous cycle’: with less water, better results, and much lower operating costs, everyone wins,” Beck says. “Because ultimately, our customers want to be good corporate citizens.”
Working together for a better future
Beck believes that the company’s solutions are successful because they’ve been tailored to address the unique challenges of water management and the specific needs of customers—and also because they reflect Ecolab’s emphasis on services delivered in person by experts in the field. “The new Microsoft cloud technology has helped us get much better data, on-site and in real time where our people need it most,” says Beck. “We have 47,000 people serving a million customer locations around the world, from nuclear plants to vaccine factories. The right data at the right time is absolutely essential.”
In Microsoft, Ecolab has found a collaborator with a shared vision. “What truly impressed me with our Microsoft collaboration was that it was not about selling us a product,” Beck says. “It was about building something and addressing the world’s water challenge together.”

[/vc_column_text][/vc_column][/vc_row]

[vc_row][vc_column][vc_column_text]

How to Survive Cyber Monday

Cyber Monday brings a lot of great things, like crazy discounts, free shipping codes, and the best deals for online shopping. The holiday also brings a lot of bad things, like data breaches and server crashes. As your employees (and customers) may be online shopping all day, don't let your company data go unprotected. A Backup and Disaster Recovery (BDR) solution can keep your business safe while the sales commence.

[/vc_column_text][/vc_column][/vc_row][vc_row parallax="content-moving" css=".vc_custom_1465945819577{background-color: #e98922 !important;}"][vc_column][vc_column_text css_animation="appear"]

Securing productivity, collaboration and enterprise data is critically important as organizations digitally transform.

3 Obvious Reasons You Need A Backup & Disaster Recovery Plan

  • You need to protect your company data from security threats and hackers. Did you see all the recent news of political breaches by hackers who exposed “secure” data?
  • Natural disasters do occur and 90% of companies that experience one week of data downtime go out of business within 12 months.
  • Systems do crash, data gets erased or corrupted, viruses attack.
With vast quantities of vital data moving through your business, it is critical, even with limited resources and budget, to have a true business continuity and disaster recovery plan in place. This is the only way to deliver an advanced insurance policy against data loss and downtime.
Managed Solution provides a Business Continuity/Backup & Disaster Recovery Service to protect data from loss and prevent costly downtime in the event of a catastrophic server failure.
[/vc_column_text][/vc_column][/vc_row][vc_row][vc_column][vc_column_text]

 

Learn More About Backup & Disaster Recovery

[/vc_column_text][/vc_column][/vc_row]

 


The public preview of SQL Server on Linux has arrived!

As written on info.microsoft.com
The public preview of the next major release of SQL Server brings the power of SQL Server to both Linux and Windows.  SQL Server enables developers and organizations to build intelligent applications with industry-leading performance and security technologies using their preferred language and environment.  With the next release of SQL Server, you can develop applications with SQL Server on Linux, Windows, Docker, or macOS (via Docker) and then deploy to Linux, Windows, or Docker, on-premises or in the cloud.
You’ll find native Linux installations made easy with familiar RPM and APT packages for Red Hat Enterprise Linux and Ubuntu Linux, and a package for SUSE Linux Enterprise Server will be coming soon as well.  Finally, the public preview for SQL Server is also available on Azure Virtual Machines on Windows and Linux and as images available on Docker Hub, offering a quick and easy installation within minutes.
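If you want to try the preview from client code, here is a minimal sketch of connecting to an instance, for example one started from the Docker Hub image. It is illustrative only: the server address, password, and ODBC driver version are placeholders you would adjust for your own environment.

import pyodbc  # requires the Microsoft ODBC Driver for SQL Server to be installed

# Placeholder connection details for a local preview instance
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 13 for SQL Server};"
    "SERVER=localhost,1433;DATABASE=master;"
    "UID=sa;PWD=<your-strong-password>"
)

# @@VERSION reports the build and, on the Linux preview, the platform it runs on
row = conn.cursor().execute("SELECT @@VERSION;").fetchone()
print(row[0])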
Tooling on Linux
We have also released updated versions of our flagship SQL Server tools, including SQL Server Management Studio (SSMS), SQL Server Data Tools (SSDT) for Visual Studio, and SQL Server PowerShell, with support for the next release of SQL Server on Windows and Linux. We are also excited to announce the new SQL Server extension for Visual Studio Code that is available now on the Visual Studio Code marketplace. Developers can use the SQL extension for VS Code on macOS/Linux/Windows with SQL Server running anywhere (on-premises, on Linux and Windows, in any cloud, in virtual machines, Docker) and with Azure SQL DB and Azure SQL DW. Native command-line tools are also available for SQL Server on Linux.
Get started today
Try the SQL Server on Linux Public Preview today! Get started with the public preview of the next release of SQL Server on Linux, macOS (via Docker) and Windows with our tutorials that show you how to install and use SQL Server on macOS, Docker, Windows, RHEL and Ubuntu and quickly build an app in a programming language of your choice.

 

The fundamentals of Azure identity management

By Curtis Love as written on docs.microsoft.com
Managing identity is just as important in the public cloud as it is on premises. To help with this, Azure supports several different cloud identity technologies. They include these:
  • You can run Windows Server Active Directory (commonly called just AD) in the cloud using virtual machines created with Azure Virtual Machines. This approach makes sense when you're using Azure to extend your on-premises datacenter into the cloud.
  • You can use Azure Active Directory to give your users single sign-on to Software as a Service (SaaS) applications. Microsoft's Office 365 uses this technology, for example, and applications running on Azure or other cloud platforms can also use it.
  • Applications running in the cloud or on-premises can use Azure Active Directory Access Control to let users log in using identities from Facebook, Google, Microsoft, and other identity providers.

Running Windows Server Active Directory in virtual machines

Running Windows Server AD in Azure virtual machines is much like running it on-premises. Figure 1 shows a typical example of how this looks.


Figure 1: Windows Server Active Directory can run in Azure virtual machines connected to an organization's on-premises datacenter using Azure Virtual Network.
In the example shown here, Windows Server AD is running in VMs created using Azure Virtual Machines, the platform's IaaS technology. These VMs and a few others are grouped into a virtual network connected to an on-premises datacenter using Azure Virtual Network. The virtual network carves out a group of cloud virtual machines that interact with the on-premises network via a virtual private network (VPN) connection. Doing this lets these Azure virtual machines look like just another subnet to the on-premises datacenter. As the figure shows, two of those VMs are running Windows Server AD domain controllers. The other virtual machines in the virtual network might be running applications, such as SharePoint, or being used in some other way, such as for development and testing. The on-premises datacenter is also running two Windows Server AD domain controllers.
There are several options for connecting the domain controllers in the cloud with those running on premises:
    • Make all of them part of a single Active Directory domain.
    • Create separate AD domains on-premises and in the cloud that are part of the same forest.
    • Create separate AD forests in the cloud and on-premises, then connect the forests using cross-forest trusts or Windows Server Active Directory Federation Services (AD FS), which can also run in virtual machines on Azure.
Whatever choice is made, an administrator should make sure that authentication requests from on-premises users go to cloud domain controllers only when necessary, since the link to the cloud is likely to be slower than on-premises networks. Another factor to consider in connecting cloud and on-premises domain controllers is the traffic generated by replication. Domain controllers in the cloud are typically in their own AD site, which allows an administrator to schedule how often replication is done. Azure charges for traffic sent out of an Azure datacenter, although not for bytes sent in, which might affect the administrator's replication choices. It's also worth pointing out that while Azure does provide its own Domain Name System (DNS) support, this service is missing features required by Active Directory (such as support for Dynamic DNS and SRV records). Because of this, running Windows Server AD on Azure requires setting up your own DNS servers in the cloud.
Running Windows Server AD in Azure VMs can make sense in several different situations. Here are some examples:
    • If you're using Azure Virtual Machines as an extension of your own datacenter, you might run applications in the cloud that need local domain controllers to handle things such as Windows Integrated Authentication requests or LDAP queries. SharePoint, for example, interacts frequently with Active Directory, and so while it's possible to run a SharePoint farm on Azure using an on-premises directory, setting up domain controllers in the cloud will significantly improve performance. (It's important to realize that this isn't necessarily required, however; plenty of applications can run successfully in the cloud using only on-premises domain controllers.)
    • Suppose a faraway branch office lacks the resources to run its own domain controllers. Today, its users must authenticate to domain controllers on the other side of the world - logins are slow. Running Active Directory on Azure in a closer Microsoft datacenter can speed this up without requiring more servers in the branch office.
    • An organization that uses Azure for disaster recovery might maintain a small set of active VMs in the cloud, including a domain controller. It can then be prepared to expand this site as needed to take over for failures elsewhere.
There are also other possibilities. For example, you're not required to connect Windows Server AD in the cloud to an on-premises datacenter. If you wanted to run a SharePoint farm that served a particular set of users, for instance, all of whom would log in solely with cloud-based identities, you might create a standalone forest on Azure. How you use this technology depends on what your goals are. (For more detailed guidance on using Windows Server AD with Azure, see here.)

Using Azure Active Directory

As SaaS applications become more and more common, they raise an obvious question: What kind of directory service should these cloud-based applications use? Microsoft's answer to that question is Azure Active Directory.
There are two main options for using this directory service in the cloud:
    • Individuals and organizations that use only SaaS applications can rely on Azure Active Directory as their sole directory service.
    • Organizations that run Windows Server Active Directory can connect their on-premises directory to Azure Active Directory, then use it to give their users single sign-on to SaaS applications.
Figure 2 illustrates the first of these two options, where Azure Active Directory is all that's required.


Figure 2: Azure Active Directory gives an organization's users single sign-on to SaaS applications, including Office 365.
As the figure shows, Azure AD is a multi-tenant service. This means that it can simultaneously support many different organizations, storing directory information about users at each of them. In this example, a user at organization A is trying to access a SaaS application. This application might be part of Office 365, such as SharePoint Online, or it might be something else - non-Microsoft applications can also use this technology. Because Azure AD supports the SAML 2.0 protocol, all that's required from an application is the ability to interact using this industry standard. (In fact, applications that use Azure AD can run in any datacenter, not just an Azure datacenter.)
The process begins when the user accesses a SaaS application (step 1). To use this application, the user must present a token issued by Azure AD.
This token contains information that identifies the user, and it's digitally signed by Azure AD. To get the token, the user authenticates himself to Azure AD by providing a username and password (step 2). Azure AD then returns the token he needs (step 3).
This token is then sent to the SaaS application (step 4), which validates the token's signature and uses its contents (step 5). Typically, the application will use the identity information the token contains to decide what information the user is allowed to access, and perhaps in other ways.
If the application needs more information about the user than what's contained in the token, it can request this directly from Azure AD using the Azure AD Graph API (step 6). In the initial version of Azure AD, the directory schema is quite simple: It contains just users and groups and relationships among them. Applications can use this information to learn about connections between users. For example, suppose an application needs to know who this user's manager is to decide whether he's allowed access to some chunk of data. It can learn this by querying Azure AD through the Graph API.
The Graph API uses an ordinary RESTful protocol, which makes it straightforward to use from most clients, including mobile devices. The API also supports the extensions defined by OData, adding things such as a query language to let clients access data in more useful ways. (For more on OData, see Introducing OData.) Because the Graph API can be used to learn about relationships between users, it lets applications understand the social graph that's embedded in the Azure AD schema for a particular organization (which is why it's called the Graph API). And to authenticate itself to Azure AD for Graph API requests, an application uses OAuth 2.0.
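As a rough, illustrative sketch of that flow: an application can obtain a token with an OAuth 2.0 client credentials request and then query the directory over REST. The tenant name, application ID, and secret below are placeholders, and the Graph endpoint and api-version shown reflect the Azure AD Graph API described in this article; treat them as assumptions to verify against current documentation.

import requests

tenant = "contoso.onmicrosoft.com"          # placeholder tenant
token_response = requests.post(
    "https://login.microsoftonline.com/" + tenant + "/oauth2/token",
    data={
        "grant_type": "client_credentials",  # the app authenticates as itself via OAuth 2.0
        "client_id": "<application-id>",
        "client_secret": "<application-secret>",
        "resource": "https://graph.windows.net",
    },
)
access_token = token_response.json()["access_token"]

# Query the Graph API for users; OData query options such as $filter also work here
users = requests.get(
    "https://graph.windows.net/" + tenant + "/users?api-version=1.6",
    headers={"Authorization": "Bearer " + access_token},
).json()
for user in users["value"]:
    print(user["displayName"])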
If an organization doesn't use Windows Server Active Directory - it has no on-premises servers or domains - and relies solely on cloud applications that use Azure AD, using just this cloud directory would give the firm's users single sign-on to all of them. Yet while this scenario gets more common every day, most organizations still use on-premises domains created with Windows Server Active Directory. Azure AD has a useful role to play here as well, as Figure 3 shows.
Figure 3: An organization can federate Windows Server Active Directory with Azure Active Directory to give its users single sign-on to SaaS applications.
In this scenario, a user at organization B wishes to access a SaaS application. Before she does this, the organization's directory administrators must establish a federation relationship with Azure AD using AD FS, as the figure shows. Those admins must also configure data synchronization between the organization's on-premises Windows Server AD and Azure AD. This automatically copies user and group information from the on-premises directory to Azure AD. Notice what this allows: In effect, the organization is extending its on-premises directory into the cloud. Combining Windows Server AD and Azure AD in this way gives the organization a directory service that can be managed as a single entity, while still having a footprint both on-premises and in the cloud.
To use Azure AD, the user first logs in to her on-premises Active Directory domain as usual (step 1). When she tries to access the SaaS application (step 2), the federation process results in Azure AD issuing her a token for this application (step 3). (For more on how federation works, see Claims-Based Identity for Windows: Technologies and Scenarios.) As before, this token contains information that identifies the user, and it's digitally signed by Azure AD. This token is then sent to the SaaS application (step 4), which validates the token's signature and uses its contents (step 5). And as in the previous scenario, the SaaS application can use the Graph API to learn more about this user if necessary (step 6).
Today, Azure AD isn't a complete replacement for on-premises Windows Server AD. As already mentioned, the cloud directory has a much simpler schema, and it's also missing things such as group policy, the ability to store information about machines, and support for LDAP. (In fact, a Windows machine can't be configured to let users log in to it using nothing but Azure AD - this isn't a supported scenario.) Instead, the initial goals of Azure AD include letting enterprise users access applications in the cloud without maintaining a separate login and freeing on-premises directory administrators from manually synchronizing their on-premises directory with every SaaS application their organization uses. Over time, however, expect this cloud directory service to address a wider range of scenarios.

Using Azure Active Directory Access Control

Cloud-based identity technologies can be used to solve a variety of problems. Azure Active Directory can give an organization's users single sign-on to multiple SaaS applications, for example. But identity technologies in the cloud can also be used in other ways.
Suppose, for instance, that an application wishes to let its users log in using tokens issued by multiple identity providers (IdPs). Lots of different identity providers exist today, including Facebook, Google, Microsoft, and others, and applications frequently let users sign in using one of these identities. Why should an application bother to maintain its own list of users and passwords when it can instead rely on identities that already exist? Accepting existing identities makes life simpler both for users, who have one less username and password to remember, and for the people who create the application, who no longer need to maintain their own lists of usernames and passwords.
But while every identity provider issues some kind of token, those tokens aren't standard - each IdP has its own format. Furthermore, the information in those tokens also isn't standard. An application that wishes to accept tokens issued by, say, Facebook, Google, and Microsoft is faced with the challenge of writing unique code to handle each of these different formats.
But why do this? Why not instead create an intermediary that can generate a single token format with a common representation of identity information? This approach would make life simpler for the developers who create applications, since they now need to handle only one kind of token. Azure Active Directory Access Control does exactly this, providing an intermediary in the cloud for working with diverse tokens. Figure 4 shows how it works.
Figure 4: Azure Active Directory Access Control makes it easier for applications to accept identity tokens issued by different identity providers.
The process begins when a user attempts to access the application from a browser. The application redirects her to an IdP of her choice (and that the application also trusts). She authenticates herself to this IdP, such as by entering a username and password (step 1), and the IdP returns a token containing information about her (step 2).
As the figure shows, Access Control supports a range of different cloud-based IdPs, including accounts created by Google, Yahoo, Facebook, Microsoft (formerly known as Windows Live ID), and any OpenID provider. It also supports identities created using Azure Active Directory and, through federation with AD FS, Windows Server Active Directory. The goal is to cover the most commonly used identities today, whether they're issued by IdPs in the cloud or on-premises.
Once the user's browser has an IdP token from her chosen IdP, it sends this token to Access Control (step 3). Access Control validates the token, making sure that it really was issued by this IdP, then creates a new token according to the rules that have been defined for this application. Like Azure Active Directory, Access Control is a multi-tenant service, but the tenants are applications rather than customer organizations. Each application can get its own namespace, as the figure shows, and can define various rules about authorization and more.
These rules let each application's administrator define how tokens from various IdPs should be transformed into an Access Control token. For example, if different IdPs use different types for representing usernames, Access Control rules can transform all of these into a common username type. Access Control then sends this new token back to the browser (step 4), which submits it to the application (step 5). Once it has the Access Control token, the application verifies that this token really was issued by Access Control, then uses the information it contains (step 6).
While this process might seem a little complicated, it actually makes life significantly simpler for the creator of the application. Rather than handle diverse tokens containing different information, the application can accept identities issued by multiple identity providers while still receiving only a single token with familiar information. Also, rather than require each application to be configured to trust various IdPs, these trust relationships are instead maintained by Access Control - an application need only trust it.
It's worth pointing out that nothing about Access Control is tied to Windows - it could just as well be used by a Linux application that accepted only Google and Facebook identities. And even though Access Control is part of the Azure Active Directory family, you can think of it as an entirely distinct service from what was described in the previous section. While both technologies work with identity, they address quite different problems (although Microsoft has said that it expects to integrate the two at some point).
Working with identity is important in nearly every application. The goal of Access Control is to make it easier for developers to create applications that accept identities from diverse identity providers. By putting this service in the cloud, Microsoft has made it available to any application running on any platform.

[vc_row gmbt_prlx_parallax="up" font_color="#ffffff" css=".vc_custom_1467053554901{padding-top: 170px !important;padding-right: 0px !important;padding-bottom: 190px !important;padding-left: 0px !important;background: rgba(234,137,34,0.66) url(https://managedsolut.wpengine.com/wp-content/uploads/2016/10/webinar-series-managed-solution-pay-as-you-go.jpg) !important;background-position: center !important;background-repeat: no-repeat !important;background-size: cover !important;*background-color: rgb(234,137,34) !important;}"] [vc_column][vc_column_text]

Webinar: Minimize Cost with “Pay As You Go” Licensing

 

Pay per user, per month has revolutionized the way companies are forecasting expenses, consuming technology and boosting productivity. Microsoft will be trimming down more EAs as it ushers more customers to pay as you go. Watch this on-demand webinar to hear how it works.


[/vc_column_text][/vc_column][/vc_row]


Introducing the Windows Azure Pack connector: two clouds, one portal

As written on msdn.microsoft.com
Microsoft IT has created a more seamless way to manage hybrid clouds. Now available for download, the Windows Azure Pack connector lets administrators and tenants use one portal to manage infrastructure as a service (IaaS) virtual machines in both private clouds and public Azure subscriptions.
We’ve heard from our partners—they’ve bought into the vision of the hybrid cloud. Until now, hybrid cloud administration has always required multiple portals. Microsoft IT set out to create a more seamless connection between private and public cloud environments and their management systems. We are pleased to provide the result as an open-source solution for everyone.
Partners provided great feedback on the idea, and told us that they need a more streamlined way to manage IaaS VMs on both private and public clouds. At the same time, they wanted strong oversight and governance. We also heard that partners are sensitive to their tenants. Simply put, “Don’t make my customers leave my portal!”
Microsoft IT responded to a cloud management challenge that was first recognized internally, realized that the solution resonated with a broader partner base, and then quickly created and released an open-source solution that scaled and engaged the larger community.
You’ve probably used a Windows Azure Pack portal to manage IaaS VMs on your private clouds. With the Windows Azure Pack connector, the portal has now been extended to create and manage public Azure subscriptions and VMs. The connector, which supports the Azure Resource Manager fabric, provides a “single pane of glass” administration experience. You create and manage both private cloud VMs and public Azure subscriptions and VMs.
With the Windows Azure Pack connector, administrators and tenants onboard and customize Azure subscriptions with an end-to-end installer and an automated installation test suite. Customizations for administrators include a variety of operating system images and different VM sizes. Tenants use the same portal to provision and configure Azure VMs.
Microsoft IT originally created the connector for internal use. We are pleased to extend the open-source connector to our partner network.
Download it today at https://github.com/microsoft/phoenix.

SQL Server as a Machine Learning Model Management System

By Rimma Nehme as written on blogs.technet.microsoft.com

Machine Learning Model Management

If you are a data scientist, business analyst or a machine learning engineer, you need model management – a system that manages and orchestrates the entire lifecycle of your machine learning models. Analytical models must be trained, compared and monitored before being deployed into production, and many steps must take place in order to operationalize a model’s lifecycle. There isn’t a better tool for that than SQL Server!

SQL Server as an ML Model Management System

In this blog, I will describe how SQL Server can enable you to automate, simplify and accelerate machine learning model management at scale – from build, train, test and deploy all the way to monitor, retrain and redeploy or retire. SQL Server treats models just like data – storing them as serialized varbinary objects. As a result, it is pretty agnostic to the analytics engines that were used to build models, thus making it a pretty good model management tool for not only R models (because R is now built into SQL Server 2016) but for other runtimes as well.
SELECT * FROM [dbo].[models]


Figure 1: Machine Learning model is just like data inside SQL Server.
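As a minimal sketch of that idea, a client can serialize a trained model and insert it into a varbinary(max) column like any other row. The table, column names, connection string, and toy model below are placeholders chosen for illustration; any serializable model object would work the same way.

import pickle
import pyodbc
from sklearn.linear_model import LinearRegression

# Train a toy model purely for illustration
model = LinearRegression().fit([[0], [1], [2]], [0, 1, 2])
payload = pickle.dumps(model)               # serialized bytes, ready for varbinary(max)

conn = pyodbc.connect("DSN=ModelDb")        # placeholder connection
cursor = conn.cursor()
cursor.execute(
    "INSERT INTO dbo.models (model_name, model) VALUES (?, ?)",
    ("toy_regression", pyodbc.Binary(payload)),
)
conn.commit()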

SQL Server’s approach to machine learning model management is an elegant solution. While there are existing tools that provide some capabilities for managing models and deployment, using SQL Server keeps the models “close” to data, so nearly all the capabilities of a management system for data transfer seamlessly to machine learning models (see Figure 2). This can help simplify the process of managing models tremendously, resulting in faster delivery and more accurate business insights.

Publishing Intelligence To Where Data Lives

Figure 2: By pushing machine learning models inside SQL Server 2016 (on the right), you get throughput, parallelism, security, reliability, compliance certifications and manageability, all in one. It’s a big win for data scientists and developers – you don’t have to build the management layer separately. Furthermore, just like data in databases can be shared across multiple applications, you can now share the predictive models. Models and intelligence become “yet another type of data”, managed by SQL Server 2016.

Why Machine Learning Model Management?

Today there is no easy way to monitor, retrain and redeploy machine learning models in a systematic way. In general, data scientists collect the data they are interested in, prepare and stage the data, apply different machine learning techniques to find a best-in-class model, and continually tweak the parameters of the algorithm to refine the outcomes. Automating and operationalizing this process is difficult. For example, a data scientist must code the model, select parameters and a runtime environment, train the model on batch data, and monitor the process to troubleshoot errors that might occur. This process is repeated iteratively on different parameters and machine learning algorithms, and after comparing the models on accuracy and performance, the model can then be deployed.
Currently, there is no standard method for comparing, sharing or viewing models created by other data scientists, which results in siloed analytics work. Without a way to view models created by others, data scientists leverage their own private library of machine learning algorithms and datasets for their use cases. As models are built and trained by many data scientists, the same algorithms may be used to build similar models, particularly if a certain set of algorithms is common for a business’s use cases. Over time, models begin to sprawl and duplicate unnecessarily, making it more difficult to establish a centralized library.


Figure 3: Why SQL Server 2016 for machine learning model management.

In light of these challenges, there is an opportunity to improve model management.

Why SQL Server 2016 for ML Model Management?

There are many benefits to using SQL Server for model management. Specifically, you can use SQL Server 2016 for the following:
  • Model Store and Trained Model Store: SQL Server can efficiently store a table of “pre-baked” models of commonly used machine learning algorithms that can be trained on various datasets (already present in the database), as well as trained models for deployment against a live stream for real-time data.
  • Monitoring service and Model Metadata Store: SQL Server can provide a service that monitors the status of the machine learning model during its execution on the runtime environment for the user, as well as any metadata about its execution that is then stored for the user.
  • Templated Model Interfaces: SQL Server can store interfaces that abstract the complexity of machine learning algorithms, allowing users to specify the inputs and outputs for the model.
  • Runtime Verification (for External Runtimes): SQL Server can provide a runtime verification mechanism using a stored procedure to determine which runtime environments can support a model prior to execution, helping to enable faster iterations for model training.
  • Deployment and Scheduler: Using SQL Server’s trigger mechanism, automatic scheduling and an extended stored procedure you can perform automatic training, deployment and scheduling of models on runtime environments, obviating the need to operate the runtime environments during the modeling process.
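Picking up the "Trained Model Store" item above, the sketch below shows the simplest version of the round trip: pull a stored model back out, deserialize it, and score new data. The table, connection, and model name are again placeholders. With R built into SQL Server 2016, the same scoring can instead run in-database through the sp_execute_external_script stored procedure, so the model never has to leave the server.

import pickle
import pyodbc

conn = pyodbc.connect("DSN=ModelDb")        # placeholder connection
row = conn.cursor().execute(
    "SELECT TOP (1) model FROM dbo.models WHERE model_name = ?",
    ("toy_regression",),
).fetchone()

model = pickle.loads(bytes(row[0]))         # row[0] is the varbinary payload
print(model.predict([[3]]))                 # score a new observation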
Here is the list of specific capabilities that makes the above possible:

ML Model Performance:

  • Fast training and scoring of models using operational analytics (in-memory OLTP and in-memory columnstore).
  • Monitor and optimize model performance via Query store and DMVs. Query store is like a “black box” recorder on an airplane. It records how queries have executed and simplifies performance troubleshooting by enabling you to quickly find performance differences caused by changes in query plans. The feature automatically captures a history of queries, plans, and runtime statistics, and retains these for your review. It separates data by time windows, allowing you to see database usage patterns and understand when query plan changes happened on the server.
  • Hierarchical model metadata (that is easily updateable) using native JSON support: Expanded support for unstructured JSON data inside SQL Server enables you to store properties of your models using JSON format. Then you can process JSON data just like any other data inside SQL. It enables you to organize collections of your model properties, establish relationships between them, combine strongly-typed scalar columns stored in tables with flexible key/value pairs stored in JSON columns, and query both scalar and JSON values in one or multiple tables using full Transact-SQL. You can store JSON in in-memory or temporal tables, you can apply Row-Level Security predicates on JSON text, and so on.
  • Temporal support for models: SQL Server 2016’s temporal tables can be used for keeping track of the state of models at any specific point in time. Using temporal tables in SQL Server you can: (a) understand model usage trends over time, (b) track model changes over time, (c) audit all changes to models, (d) recover from accidental model changes and application errors.
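To make the JSON and temporal items above concrete, here is a hedged sketch. It assumes the models table has a JSON metadata column and has been made a system-versioned temporal table; those are illustrative assumptions about the schema, not requirements of SQL Server itself.

import pyodbc

cursor = pyodbc.connect("DSN=ModelDb").cursor()   # placeholder connection

# Read a property out of JSON metadata stored next to the model
for name, algorithm in cursor.execute(
    "SELECT model_name, JSON_VALUE(metadata, '$.algorithm') FROM dbo.models"
):
    print(name, algorithm)

# Ask the temporal table what the models looked like at an earlier point in time
cursor.execute(
    "SELECT model_name FROM dbo.models "
    "FOR SYSTEM_TIME AS OF '2016-11-01T00:00:00'"
)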

ML Model Security and Compliance:

  • Sensitive model encryption via Always Encrypted: Always Encrypted can protect models at rest and in motion by requiring client applications to use an Always Encrypted driver when communicating with the database, so data is transferred in an encrypted state.
  • Transparent Data Encryption (TDE) for models. TDE is the primary SQL Server encryption option. TDE enables you to encrypt an entire database that may store machine learning models. Backups for databases that use TDE are also encrypted. TDE protects the data at rest and is completely transparent to the application and requires no coding changes to implement.
  • Row-Level Security enables you to protect the model in a table row-by-row, so a particular user can only see the models (rows) to which they are granted access.
  • Dynamic model (data) masking obfuscates a portion of the model data to anyone unauthorized to view it, returning masked data to non-privileged users (for example, credit card numbers).
  • Change model capture can be used to capture insert, update, and delete activity applied to models stored in tables in SQL Server, and to make the details of the changes available in an easily consumed relational format. The change tables used by change data capture contain columns that mirror the column structure of a tracked source table, along with the metadata needed to understand the changes that have occurred.
  • Enhanced model auditing. Auditing is an important checks-and-balances mechanism for many organizations. SQL Server 2016 includes new auditing features that support model auditing: you can implement user-defined audit, audit filtering and audit resilience.
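As one hedged illustration of the Row-Level Security item above, the filter predicate below limits each user to the model rows they own. The function name, policy name, and owner_name column are invented for this example and would need to match your own schema.

import pyodbc

cursor = pyodbc.connect("DSN=ModelDb").cursor()   # placeholder connection

# Inline table-valued function used as the filter predicate
cursor.execute("""
    CREATE FUNCTION dbo.fn_model_access(@owner_name AS sysname)
    RETURNS TABLE
    WITH SCHEMABINDING
    AS RETURN SELECT 1 AS allowed WHERE @owner_name = USER_NAME()
""")

# Bind the predicate to dbo.models so queries are filtered row by row
cursor.execute("""
    CREATE SECURITY POLICY dbo.model_access_policy
    ADD FILTER PREDICATE dbo.fn_model_access(owner_name) ON dbo.models
    WITH (STATE = ON)
""")
cursor.commit()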

ML Model Availability:

  • AlwaysOn for model availability and champion-challenger. An availability group in SQL Server supports a failover environment. An availability group supports a set of primary databases and one to eight sets of corresponding secondary databases. Secondary databases are not backups. In addition, you can have automatic failover based on DB health. One interesting thing about availability groups in SQL Server with readable secondaries is that they enable “champion-challenger” model setup. The champion model runs on a primary, whereas challenger models are scoring and being monitored on the secondaries for accuracy (without having any impact on the performance of the transactional database). Whenever a new champion model emerges, it’s easy to enable it on the primary.

ML Model Scalability

  • Enhanced model caching can facilitate model scalability and high performance. SQL Server enables caching with automatic, multiple TempDB files per instance in multi-core environments.
In summary, SQL Server delivers top-notch data management with performance, security, availability, and scalability built into the solution. Because SQL Server is designed to meet security standards, it has minimal total surface area and database software that is inherently more secure. Enhanced security, combined with built-in, easy-to-use tools and controlled model access, can help organizations meet strict compliance policies. Integrated high availability solutions enable faster failover and more reliable backups – and they are easier to configure, maintain, and monitor, which helps organizations reduce the total cost of model management (TCMM). In addition, SQL Server supports complex data types and non-traditional data sources, and it handles them with the same attention – so data scientists can focus on improving model quality and outsource all of the model management to SQL Server.

Conclusion

Using SQL Server 2016 you can do model management with ease. SQL Server is unique among machine learning model management tools because it is a database engine optimized for data management. The key insight here is that “models are just like data” to an engine like SQL Server, and as such we can leverage most of the mission-critical features of data management built into SQL Server for machine learning models. Using SQL Server for ML model management, an organization can create an ecosystem for harvesting analytical models, enabling data scientists and business analysts to discover the best models and promote them for use. As companies rely more heavily on data analytics and machine learning, the ability to manage, train, deploy and share models that turn analytics into action-oriented outcomes is essential.

Managed Solution is a full-service technology firm that empowers business by delivering, maintaining and forecasting the technologies they’ll need to stay competitive in their market place. Founded in 2002, the company quickly grew into a market leader and is recognized as one of the fastest growing IT Companies in Southern California.

We specialize in providing full managed services to businesses of every size, industry, and need.


Tokyo University of Technology Implements Futuristic Environment Using Cloud-Based Solutions

As written on customers.microsoft.com
To produce qualified students based on international standards, the Tokyo University of Technology (hereafter referred to as TUT) has always taken an active approach to ICT investment. However, the 100 or so servers that the university was using as part of its ICT assets were rapidly becoming obsolete. The system needed to be updated. After a year’s worth of discussions, plans emerged for building a university-wide, cloud-based system and a core database that would cut operating loads to the lowest possible level.

Thus, TUT chose Microsoft Azure, Office 365, Microsoft Dynamics CRM and other services and technologies offered by Microsoft.

Business Needs

Turning away from aging ICT assets and conceiving the best, most up-to-date system environment
Since its establishment in 1986, the university has promoted three particular aims: training students in the use of technologies and expert scientific theory for the betterment of society; engaging in advanced research and passing research findings back to society; and creating an ideal educational and research environment.
The school takes an active approach to investing in its ICT environment. For instance, when it set up the School of Media Science in 1999 (the first one of its kind in Japan), it required students to have a laptop. The school also actively promoted use of the internet. When the School of Computer Science and the School of Bionics (currently the Department of Applied Biology) were set up in 2013, as a result of the School of Engineering’s reorganization, the requirement that students have laptops was applied university-wide. At the same time, to enable students to use the internet freely, the school implemented a wired ethernet environment university-wide.
However, Kazuya Tago, Head of the Media Center and Professor at the School of Computer Science at TUT, realizes that the network environment reached its zenith more than 10 years ago. It has now become obsolete from a technological perspective: “What the university asked for most urgently was a wireless network. To meet this need, each university institute created wireless access points on its own initiative. This solution met local demands, but we still had not achieved the comfort of being able to connect to the internet throughout the whole campus.”
Some 100 servers installed for administrative systems had also started aging. The expenses necessary for maintaining the system started to cut into budgets for equipping students with the latest IT environment. Current students use thin notebooks without Ethernet ports, and in their daily lives they have acquired a good command of smartphones. However, no one would have predicted such a state 10 years ago. The university decided that it could not afford to continue with a “maintenance” approach to its existing assets; it had to come up with a future-oriented concept for the best and most up-to-date ICT environment. Thus, since April 2013, TUT has been undertaking bold innovations in what it calls the “university-wide full shift of the ICT environment to the cloud”.
The university-wide full shift to the cloud at TUT was based on conclusions gathered during discussions of various options over the course of one year. These discussions began in 2012. Professor Tago said that during this discussion period “we realized that various technologies that would be necessary suddenly appeared right before our very eyes.” These were technologies and services that Microsoft offers, such as Microsoft Azure, Microsoft Office 365 and Microsoft Dynamics CRM.

Solution

The new solution for a full shift to the cloud at TUT combined the use of PaaS and SaaS cloud services with effective system operations, made possible by building up the school’s core database (hereafter referred to as “core DB”). TUT asked suppliers to design the system so that it would anticipate the school’s needs 10-15 years ahead and deliver the best possible system environment in terms of functionality, expandability, flexibility, cost, and so on. TUT administrators decided to use a combination of three types of cloud services: “Platform as a Service” (PaaS), “Infrastructure as a Service” (IaaS) and “Software as a Service” (SaaS).
Information entered from all systems will be stored in a newly created core DB; and through a redeveloped, university-wide, wireless network, it will be possible to use data extensively for CRM (Customer Relationship Management) and for university administration.
Existing solutions can be utilized in their current state. However, in order to reduce workloads when building the system, it was important that PaaS, where the OS environment and security are guaranteed above a certain level, become the cloud-based ICT infrastructure. TUT also considered IaaS services like Amazon Web Services (hereafter AWS) but abandoned this idea at an early stage.
Using Microsoft Azure for IaaS to build and operate an Oracle database
Professor Tago now recalls that “only Microsoft Azure and Office 365 services matched all university-determined criteria.” Moreover, as work proceeded, both services evolved rapidly and their earlier gaps in functionality disappeared.
Another critical event involved work with the university’s Oracle database. It was a big moment when Microsoft announced Microsoft Azure’s compatibility with Oracle. “TUT had used its Oracle database for a long time to support existing administration systems. Now, thanks to the compatibility, the database could be easily moved to the cloud,” explains Yuzuru Kimura, CEO of Page One Co., Ltd., who was in charge of designing and building the system. Microsoft Azure is now used for both PaaS and IaaS.
Microsoft Lync Online transforms the communication environment for TUT faculty
TUT uses Office 365 for mail and portal sites in the faculty/staff-oriented ICT environment. Through authentication infrastructure using Active Directory, it is possible to access the system from within and from outside the university. Lync Online offers new functions that were lacking in the previous environment, such as presence information showing whether faculty members are available. Lync Online also allows a choice of multiple communication tools for instant messaging, email or web conferences.
A flexible system through end-user computing (hereafter EUC) using Microsoft SharePoint Server and Microsoft Dynamics CRM
TUT has the highest expectations for use of SharePoint and Microsoft Dynamics CRM. It also has high hopes for Microsoft Dynamics CRM as a data viewer that enables the extraction of data entered into and stored in the core DB. Data also remain anonymous and can be sorted and read based on various criteria. The web template is customized in several ways and works together with SharePoint; enabling fluid use of data within the university.

Benefits

Better results through promoting EUC, and a rational system that businesses could also adopt
Even though TUT has started its trial efforts, using a full shift to the cloud to reduce the workload of system operations to the lowest possible level, Professor Tago gave a small warning: “We have only taken the first step. From now on, the quantity of internal university data that is processed by any university in the world will continue to grow at a fast pace. Given this fact, we feel it makes sense to accumulate databases in the cloud. We also came to the conclusion that the intermingled use of clouds, from PaaS to SaaS, is more rational in terms of operations and costs.”
Mr. Kimura, as the IT vendor, also agrees with the new system concept:

“Even if EUC at TUT moves forward and the quantity of data in the core DB swells, as long as it is managed using Microsoft Azure, no problems will arise. The expenses for data accumulation will be surprisingly small; even in comparison with AWS, running costs can be reduced.”

Professor Tago concludes with the following expectations:

“This is our first effort to make a full shift to the cloud at our university. As for quantity of data and operational rules, we have started from scratch. However, we ultimately succeeded thanks to a wide range of affordable and flexible cloud services provided by Microsoft Azure. Thanks to solutions like Dynamics CRM and SharePoint, information analysis is proceeding well and sophisticated IR could be realized. This achievement will become evident after further use. Personally, I think that the structure that we have put together could also become a useful model for businesses.”

Microsoft CityNext
Empowering more sustainable, prosperous, and economically competitive cities—with a simplified approach that puts people first! For more information please visit:

www.microsoft.com/citynext

 

Managed Solution is a full-service technology firm that empowers business by delivering, maintaining and forecasting the technologies they’ll need to stay competitive in their market place. Founded in 2002, the company quickly grew into a market leader and is recognized as one of the fastest growing IT Companies in Southern California.

We specialize in providing full Microsoft solutions to businesses of every size, industry, and need.

Contact us Today!

Chat with an expert about your business’s technology needs.