[vc_row][vc_column][vc_column_text]

Amazon Machine Learning gives data science newbies easy-to-use solutions for the most common problems

By Martin Heller as written on infoworld.com
As a physicist, I was originally trained to describe the world in terms of exact equations. Later, as an experimental high-energy particle physicist, I learned to deal with vast amounts of noisy data and to evaluate competing models that describe it. Business data, taken in bulk, is often messier and harder to model than the physics data on which I cut my teeth. Simply put, human behavior is complicated, inconsistent, and not well understood, and it’s affected by many variables.
If your intention is to predict which previous customers are most likely to subscribe to a new offer based on historical patterns, you may discover there are non-obvious correlations in addition to obvious ones, as well as quite a bit of randomness. When graphing the data and running exploratory statistical analyses don’t point you to a model that explains what’s happening, it might be time for machine learning.

[/vc_column_text][/vc_column][/vc_row][vc_row][vc_column][vc_column_text]

Amazon’s approach to a machine learning service is intended to work for analysts who understand the business problem being solved, whether or not they understand data science and machine learning algorithms. As we’ll see, that intention gives rise to different offerings and interfaces than you’ll find in Microsoft Azure Machine Learning (click for my review), although the results are similar.
With both services, you start with historical data, identify a target for prediction from observables, extract relevant features, feed them into a model, and allow the system to optimize the coefficients of the model. Then you evaluate the model, and if it’s acceptable, you use it to make predictions. For example, a bank may want to build a model to predict whether a new credit card charge is legitimate or fraudulent, and a manufacturer may want to build a model to predict how much a potential customer is likely to spend on its products.
In general, you approach Amazon Machine Learning by first uploading and cleaning up your data; then creating, training, and evaluating an ML model; and finally by creating batch or real-time predictions. Each step is iterative, as is the whole process. Machine learning is not a simple, static, magic bullet, even with the algorithm selection left to Amazon.

 


Data sources

Amazon Machine Learning can read data -- in plain-text CSV format -- that you have stored in Amazon S3. The data can also come to S3 automatically from Amazon Redshift and Amazon RDS for MySQL. If your data comes from a different database or another cloud, you’ll need to get it into S3 yourself.
When you create a data source, Amazon Machine Learning reads your input data; computes descriptive statistics on its attributes; and stores the statistics, the correlations with the target, a schema, and other information as part of the data source object. The data is not copied. You can view the statistics, invalid value information, and more on the data source’s Data Insights page.
The schema stores the name and data type of each field; Amazon Machine Learning can read the name from the header row of the CSV file and infer the data type from the values. You can override these in the console.
You actually need two data sources for Amazon Machine Learning: one for training the model (usually 70 percent of the data) and one for evaluating the model (usually 30 percent of the data). You can presplit your data yourself into two S3 buckets or ask Amazon Machine Learning to split your data either sequentially or randomly when you create the two data sources from a single bucket.
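If you prefer to script this step, here is a minimal sketch of how the two data sources might be created with the AWS SDK for Python (boto3). The bucket paths, schema file, and IDs are hypothetical placeholders; the 70/30 split is expressed through the DataRearrangement setting.

```python
import boto3

ml = boto3.client("machinelearning")

# Hypothetical S3 locations for the CSV data and its schema file
data_spec = {
    "DataLocationS3": "s3://my-bucket/marketing/bank-customers.csv",
    "DataSchemaLocationS3": "s3://my-bucket/marketing/bank-customers.csv.schema",
}

# Training data source: the first 70 percent of the rows
ml.create_data_source_from_s3(
    DataSourceId="ds-bank-train",
    DataSourceName="Bank marketing - training (70%)",
    DataSpec=dict(
        data_spec,
        DataRearrangement='{"splitting": {"percentBegin": 0, "percentEnd": 70}}',
    ),
    ComputeStatistics=True,  # descriptive statistics are required for training
)

# Evaluation data source: the remaining 30 percent
ml.create_data_source_from_s3(
    DataSourceId="ds-bank-eval",
    DataSourceName="Bank marketing - evaluation (30%)",
    DataSpec=dict(
        data_spec,
        DataRearrangement='{"splitting": {"percentBegin": 70, "percentEnd": 100}}',
    ),
    ComputeStatistics=True,
)
```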
As I discussed earlier, all of the steps in the Amazon Machine Learning process are iterative, including this one. Over time the underlying data drifts, for a variety of reasons. When that happens, you have to replace your data source with newer data and retrain your model.

Training machine learning models

Amazon Machine Learning supports three kinds of models -- binary classification, multiclass classification, and regression -- and one algorithm for each type. For optimization, Amazon Machine Learning uses Stochastic Gradient Descent (SGD), which makes multiple sequential passes over the training data and updates the feature weights once per mini-batch of samples to try to minimize the loss function. The loss function reflects the difference between the actual value and the predicted value. Gradient descent optimization works well only for continuous, differentiable loss functions, such as the logistic and squared loss functions.
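To make that concrete, here is the general shape of those losses and of the SGD update, in my own notation rather than anything Amazon publishes: w is the weight vector, x the feature vector, y the target, B a mini-batch, and η the learning rate.

```latex
% Squared loss (regression) and logistic loss (binary classification, y in {-1, +1})
L_{\mathrm{sq}}(w; x, y) = \tfrac{1}{2}\bigl(y - w^{\top}x\bigr)^{2},
\qquad
L_{\mathrm{log}}(w; x, y) = \log\!\bigl(1 + e^{-y\,w^{\top}x}\bigr)

% SGD update over a mini-batch B with learning rate \eta
w \leftarrow w - \eta\,\frac{1}{|B|}\sum_{(x_i,\,y_i)\in B} \nabla_{w} L(w; x_i, y_i)
```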
For binary classification, Amazon Machine Learning uses logistic regression (logistic loss function plus SGD). For multiclass classification, Amazon Machine Learning uses multinomial logistic regression (multinomial logistic loss plus SGD). For regression, it uses linear regression (squared loss function plus SGD). It determines the type of machine learning task being solved from the type of the target data.
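Through the API, you state the model type explicitly when you create the model. Here is a minimal boto3 sketch, reusing the hypothetical training data source from earlier; the tuning parameters shown are optional and illustrative.

```python
import boto3

ml = boto3.client("machinelearning")

ml.create_ml_model(
    MLModelId="ml-bank-subscribe",
    MLModelName="Will the customer subscribe? (binary)",
    MLModelType="BINARY",              # or "MULTICLASS" / "REGRESSION"
    TrainingDataSourceId="ds-bank-train",
    Parameters={                       # optional SGD training parameters
        "sgd.maxPasses": "10",
        "sgd.l2RegularizationAmount": "1e-06",
    },
)
```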
While Amazon Machine Learning does not offer as many choices of model as you’ll find in Microsoft’s Azure Machine Learning, it does give you robust, relatively easy-to-use solutions for the three major kinds of problems. If you need other kinds of machine learning models, such as unsupervised cluster analysis, you need to use them outside of Amazon Machine Learning -- perhaps in an RStudio or Jupyter Notebook instance that you run in an Amazon Ubuntu VM so that it can pull data from your Redshift data warehouse running in the same availability zone.

Recipes for machine learning

Often, the observable data do not correlate with the prediction target as well as you’d like. Before you run out to collect other data, you usually want to extract features from the observed data that correlate better with your target. In some cases this is simple; in others, not so much.
To draw on a physical example, some chemical reactions are surface-controlled, and others are volume-controlled. If your observations were of X, Y, and Z dimensions, then you might want to try to multiply these numbers to derive surface and volume features.
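A toy version of that derivation, with invented column names, might look like this:

```python
import pandas as pd

# Hypothetical observations: the three dimensions of each sample
df = pd.DataFrame({"x": [1.0, 2.0, 0.5], "y": [2.0, 2.0, 1.0], "z": [3.0, 1.0, 4.0]})

# Derived features for a rectangular solid: surface area and volume
df["surface"] = 2 * (df.x * df.y + df.y * df.z + df.x * df.z)
df["volume"] = df.x * df.y * df.z
```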
For an example involving people, you may have recorded single date-time stamps, when in fact the behavior you are predicting varies with time of day (say, morning versus evening rush hours) and day of week (specifically workdays versus weekends and holidays). If you have textual data, you might discover that the goal correlates better with bigrams (two words taken together) than unigrams (single words), or that the input text is in inconsistent case and should be converted to lowercase for consistency.
Choices of features in Amazon Machine Learning are held in recipes. Once the descriptive statistics have been calculated for a data source, Amazon will create a default recipe, which you can either use or override in your machine learning models on that data. While Amazon Machine Learning doesn’t give you a sexy diagrammatic option to define your feature selection the way that Microsoft’s Azure Machine Learning does, it gives you what you need in a no-nonsense manner.
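Recipes are JSON documents. As a hedged sketch of what one might look like -- the field names are invented, and the exact quoting rules are best checked against the recipe reference -- here is a recipe that lowercases a free-text field and feeds bigrams of it to the model, supplied through the Recipe parameter:

```python
import json
import boto3

ml = boto3.client("machinelearning")

# Hypothetical recipe: lowercase the 'comments' text field, then use its bigrams
recipe = {
    "groups": {},
    "assignments": {
        "clean_comments": "lowercase(comments)",
    },
    "outputs": [
        "ALL_NUMERIC",                # pass numeric attributes through unchanged
        "ALL_CATEGORICAL",
        "ngram(clean_comments, 2)",   # bigrams of the cleaned text
    ],
}

ml.create_ml_model(
    MLModelId="ml-bank-subscribe-v2",
    MLModelName="Will the customer subscribe? (bigram recipe)",
    MLModelType="BINARY",
    TrainingDataSourceId="ds-bank-train",
    Recipe=json.dumps(recipe),
)
```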

Evaluating machine learning models

I mentioned earlier that you typically reserve 30 percent of the data for evaluating the model. Evaluation is basically a matter of using the optimized coefficients to calculate predictions for every point in the reserved data source, tallying the loss function for each point, and then calculating summary statistics -- including an overall prediction accuracy metric -- and generating visualizations that help you explore the model’s accuracy beyond that single number.
For a regression model, you’ll want to look at the distribution of the residuals in addition to the root mean square error. For binary classification models, you’ll want to look at the area under the Receiver Operating Characteristic curve, as well as the prediction histograms. After training and evaluating a binary classification model, you can choose your own score threshold to achieve your desired error rates.
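If you want to sanity-check those numbers outside the console, the same metrics are easy to compute locally; here is a small scikit-learn sketch with made-up values:

```python
import numpy as np
from sklearn.metrics import mean_squared_error, roc_auc_score

# Regression: root mean square error plus a quick look at the residuals
y_true = np.array([3.1, 0.5, 2.2, 7.0])
y_pred = np.array([2.9, 0.7, 2.0, 7.4])
rmse = np.sqrt(mean_squared_error(y_true, y_pred))
residuals = y_true - y_pred

# Binary classification: area under the ROC curve from predicted scores
labels = np.array([0, 1, 1, 0, 1])
scores = np.array([0.12, 0.81, 0.55, 0.30, 0.67])
auc = roc_auc_score(labels, scores)

print(f"RMSE={rmse:.3f}  residuals={residuals}  AUC={auc:.3f}")
```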


For multiclass models the macro-average F1 score reflects the overall predictive accuracy, and the confusion matrix shows you where the model has trouble distinguishing classes. Once again, Amazon Machine Learning gives you the tools you need to do the evaluation in parsimonious form: just enough to do the job.
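In the API, an evaluation is its own object, created against the held-out data source. Here is a minimal sketch with the hypothetical IDs from earlier; the exact metric key returned depends on the model type (AUC for binary, RMSE for regression, macro-average F1 for multiclass).

```python
import boto3

ml = boto3.client("machinelearning")

ml.create_evaluation(
    EvaluationId="ev-bank-subscribe",
    EvaluationName="Bank marketing - 30% holdout",
    MLModelId="ml-bank-subscribe",
    EvaluationDataSourceId="ds-bank-eval",
)

# Once the evaluation finishes, the summary metric is attached to the object
result = ml.get_evaluation(EvaluationId="ev-bank-subscribe")
print(result["PerformanceMetrics"]["Properties"])
```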

Interpreting predictions

Once you have a model that meets your evaluation requirements, you can use it to set up a real-time Web service or to generate a batch of predictions. Bear in mind, however, that unlike physical constants, people’s behavior varies over time. You’ll need to check the prediction accuracy metrics coming out of your models periodically and retrain them as needed.
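Both flavors of prediction are single API calls. Here is a sketch under the same hypothetical IDs, with a made-up record for the real-time case:

```python
import boto3

ml = boto3.client("machinelearning")

# Real-time: create an endpoint once, then send individual records to it.
# The endpoint takes a few minutes to become ready before Predict will succeed.
endpoint = ml.create_realtime_endpoint(MLModelId="ml-bank-subscribe")
endpoint_url = endpoint["RealtimeEndpointInfo"]["EndpointUrl"]

prediction = ml.predict(
    MLModelId="ml-bank-subscribe",
    Record={"age": "42", "income": "55000", "comments": "call me in the evening"},
    PredictEndpoint=endpoint_url,
)
print(prediction["Prediction"])

# Batch: score a whole data source of records and write the results back to S3
ml.create_batch_prediction(
    BatchPredictionId="bp-spring-campaign",
    BatchPredictionName="Spring campaign scoring",
    MLModelId="ml-bank-subscribe",
    BatchPredictionDataSourceId="ds-bank-eval",
    OutputUri="s3://my-bucket/marketing/predictions/",
)
```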
As I worked with Amazon Machine Learning and compared it with Azure Machine Learning, I constantly noticed that Amazon lacks most of the bells and whistles of its Azure counterpart, in favor of giving you merely what you need. If you’re a business analyst doing machine learning predictions with one of the three supported model types, Amazon Machine Learning could be exactly what the doctor ordered. If you’re a sophisticated data analyst, it might not do quite enough for you, but you’ll probably have your own preferred development environment for the more complex cases.

[/vc_column_text][/vc_column][/vc_row]


How real businesses are using machine learning

By Lukas Biewald as written on techcrunch.com
There is no question that machine learning is at the top of the hype curve. And, of course, the backlash is already in full force: I’ve heard that old joke “Machine learning is like teenage sex; everyone is talking about it, no one is actually doing it” about 20 times in the past week alone.
But from where I sit, running a company that enables a huge number of real-world machine-learning projects, it’s clear that machine learning is already forcing massive changes in the way companies operate.
It’s not just futuristic-looking products like Siri and Amazon Echo. And it’s not just being done by companies that we normally think of as having huge R&D budgets like Google and Microsoft. In reality, I would bet that nearly every Fortune 500 company is already running more efficiently — and making more money — because of machine learning.
So where is it happening? Here are a few behind-the-scenes applications that make life better every day.

Making user-generated content valuable

The average piece of user-generated content (UGC) is awful. It’s actually way worse than you think. It can be rife with misspellings, vulgarity or flat-out wrong information. But by identifying the best and worst UGC, machine-learning models can filter out the bad and bubble up the good without needing a real person to tag each piece of content.
A similar thing happened a while back with spam emails. Remember how bad spam used to be? Machine learning helped identify spam and, basically, eradicate it. These days, it’s far less common to see spam in your inbox each morning. Expect the same to happen with UGC in the near future.
Pinterest uses machine learning to show you more interesting content. Yelp uses machine learning to sort through user-uploaded photos. NextDoor uses machine learning to sort through content on their message boards. Disqus uses machine learning to weed out spammy comments.

Finding products faster

It’s no surprise that as a search company, Google was always at the forefront of hiring machine-learning researchers. In fact, Google recently put an artificial intelligence expert in charge of search. But the ability to index a huge database and pull up results that match a keyword has existed since the 1970s. What makes Google special is that it knows which matching result is the most relevant; the way that it knows is through machine learning.
But it’s not just Google that needs smart search results. Home Depot needs to show which bathtubs in its huge inventory will fit in someone’s weird-shaped bathroom. Apple needs to show relevant apps in its app store. Intuit needs to surface a good help page when a user types in a certain tax form.
Successful e-commerce startups from Lyst to Trunk Archive employ machine learning to show high-quality content to their users. Other startups, like Rich Relevance and Edgecase, employ machine-learning strategies to give their commerce customers the benefits of machine learning when their users are browsing for products.

Engaging with customers

You may have noticed “contact us” forms getting leaner in recent years. That’s another place where machine learning has helped streamline business processes. Instead of having users self-select an issue and fill out endless form fields, machine learning can look at the substance of a request and route it to the right place.
That seems like a small thing, but ticket tagging and routing can be a massive expense for big businesses. Having a sales inquiry end up with the sales team or a complaint end up instantly in the customer service department’s queue saves companies significant time and money, all while making sure issues get prioritized and solved as fast as possible.

Understanding customer behavior

Machine learning also excels at sentiment analysis. And while public opinion can sometimes seem squishy to non-marketing folks, it actually drives a lot of big decisions.
For example, say a movie studio puts out a trailer for a summer blockbuster. They can monitor social chatter to see what’s resonating with their target audience, then tweak their ads immediately to surface what people are actually responding to. That puts people in theaters.
Another example: A game studio recently put out a new title in a popular video game line without a game mode that fans were expecting. When gamers took to social media to complain, the studio was able to monitor and understand the conversation. The company ended up changing its release schedule in order to add the feature, turning detractors into promoters.
How did they pull faint signals out of millions of tweets? They used machine learning. And in the past few years, this kind of social media listening through machine learning has become standard operating procedure.

What’s next?

Dealing with machine-learning algorithms is tricky. Normal algorithms are predictable, and we can look under the hood and see how they work. In some ways, machine-learning algorithms are more like people. As users, we want answers to questions like “why did The New York Times show me that weird ad” or “why did Amazon recommend that funny book?”
In fact, The New York Times and Amazon don’t really understand the specific results themselves any more than our brains know why we chose Thai food for dinner or got lost down a particular Wikipedia rabbit hole.
If you were getting into the machine-learning field a decade ago, it was hard to find work outside of places like Google and Yahoo. Now, machine learning is everywhere. Data is more prevalent than ever, and it’s easier to access. New products like Microsoft Azure ML and IBM Watson drive down both the setup cost and ongoing cost of state-of-the-art machine-learning algorithms.
At the same time, VCs have started funds — from WorkDay’s Machine Learning fund to Bloomberg Beta to the Data Collective — that are completely focused on funding companies across nearly every industry that use machine learning to build a sizeable advantage.
Most of the conversation about machine learning in popular culture revolves around AI personal assistants and self-driving cars (both applications are very cool!), but nearly every website you interact with is using machine learning behind the scenes. Big companies are investing in machine learning not because it’s a fad or because it makes them seem cutting edge. They invest because they’ve seen positive ROI. And that’s why innovation will continue.

[vc_row][vc_column][vc_column_text]


4 Ways SQL Server Beats Oracle:

Intelligent Cloud Database: Everything Built In:
  • In-memory across all workloads
  • Scale performance on the fly, without app downtime
  • Highest performing data warehouse
  • Voted least vulnerable 6 years in a row
  • End-to-end mobile BI on any device
  • In-database Advanced Analytics
VIEW FULL INFOGRAPHIC

 

Managed Solution’s Team has the experience and expertise to architect SQL database and reporting systems tailored for your environment. Contact us for more information 800-307-0296


[/vc_column_text][/vc_column][/vc_row]

[vc_row][vc_column][vc_column_text]

Break free from Oracle with SQL Server Competitive Migration Offer

Break free from expensive data solutions with mysterious, frustrating pricing plans and surprise add-ons. With SQL Server, you get breakthrough in-memory performance across all workloads, mission-critical high availability, and business intelligence and advanced analytics tools, all in one package.
This offer includes support services to kick-start your migration, and access to our SQL Server Essentials for the Oracle Database Administrator training. Dive into key features of SQL Server through hands-on labs and instructor-led demos, and learn how to deploy your applications—on-premises or in the cloud.
Act now: training and services are only available through June 30, 2016.

Claim your offer:

  1. Engage your account executive to begin the process
  2. Identify application workloads to migrate and specify the SQL Server cores required
  3. Enroll in a new/renewed Server Cloud Enrollment (SCE) (SCE has a minimum requirement of 50 cores and requires entire SQL Server installation to carry SA coverage)
  4. Receive free licenses and services package to kick-start your migration (Software Assurance subscription is required and services only available through June 30, 2016)
  5. Prove migration via deployment report, letter from CIO, or partner proof of execution (POE)

Managed Solution’s Team has the experience and expertise to architect SQL database and reporting systems tailored for your environment. Contact us for more information 800-307-0296


[/vc_column_text][/vc_column][/vc_row]

[vc_row][vc_column][vc_column_text]

Five Reasons You'll Love Everything Built in to SQL Server 2016

Check out the new SQL Server 2016—the first database that is born in the cloud, setting a new standard for the pace of innovation.
Five reasons to love the new SQL Server 2016:
  1. Industry leader in mission-critical - SQL Server 2016 delivers breakthrough mission-critical capabilities in scalability, performance, and availability for your most important OLTP and data warehousing workloads.
    • Scale up to 12 TB of memory and 640 logical processors with Windows Server 2016
    • Reach up to 30x faster transactions and 100x faster queries with enhanced in-memory performance
    • Run real-time Operational Analytics over transactional data
    • Balance loads across readable secondaries in Always On availability groups
  2. Most secure database - Our multi-layered approach to security has a proven track record of producing the least vulnerable database, even while being the most widely-used database on the planet.
    • Rely on a database with the least vulnerabilities of any major platform—six years in a row
    • Protect data at rest and in motion with TDE and new Always Encrypted
    • Mask sensitive data with minimal application impact using Dynamic Data Masking
    • Grant access based on user characteristics with Row Level Security
  3. Comprehensive business intelligence - SQL Server 2016 delivers a comprehensive, on-premises, enterprise-ready BI platform that helps you to transform complex data into actionable insights.
    • Create modern reports and visualize dense data with additional chart types
    • Access KPIs and mobile and paginated reports using the SQL Server Reporting Services web portal
    • Get faster SQL Server Analysis Services performance with parallel processing
    • Use enhanced multidimensional models in SQL Server Analysis Services
    • Easily set up SQL Server Data Tools Preview in Visual Studio 2015
  4. In-database advanced analytics - Built-in advanced analytics provide scalability and performance for building and running your advanced analytics algorithms directly in the core SQL Server transactional database (see the sketch after this list for a minimal example of invoking this capability).
    • Transform complex data from multiple sources into trusted data models using the most popular statistical modelling language
    • Process analytics in-place, reducing latencies and operational costs
    • Write models once and deploy anywhere: in-database, to the cloud, or to Linux, Hadoop, and Teradata
    • Access thousands of R scripts and models in CRAN (the Comprehensive R Archive Network)
  5. Consistent experience from on-premises to cloud - SQL Server 2016 delivers a consistent experience both on-premises and cloud. You get an exceptional experience whether data is in your datacenter, in a private cloud, or on Azure.
    • Dynamically stretch warm and cold data to Azure with Stretch Database
    • Tackle your mission-critical workloads with larger Azure virtual machine sizes
    • Rely on our cloud-first features, tested by millions of Azure databases
    • Use the skills you already have with common development and management tools and common T-SQL everywhere

Source: microsoft.com
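To ground the in-database analytics point (item 4 above), here is a minimal sketch of calling SQL Server 2016's sp_execute_external_script from Python via pyodbc to run an R model next to the data. The server, database, and table names are invented for illustration, and R Services must be installed with external scripts enabled.

```python
import pyodbc

# Hypothetical connection; requires SQL Server 2016 with R Services installed
# and 'external scripts enabled' turned on via sp_configure.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 13 for SQL Server};"
    "SERVER=myserver;DATABASE=SalesDB;Trusted_Connection=yes"
)

# Fit a linear model in-database over a made-up sales table and return the
# coefficients as a result set, without moving the raw data out of SQL Server.
sql = """
EXEC sp_execute_external_script
    @language = N'R',
    @script = N'
        fit <- lm(amount ~ month, data = InputDataSet);
        OutputDataSet <- data.frame(term = names(coef(fit)), value = coef(fit));',
    @input_data_1 = N'SELECT month, amount FROM dbo.MonthlySales'
WITH RESULT SETS ((term NVARCHAR(50), value FLOAT));
"""

for row in conn.cursor().execute(sql):
    print(row.term, row.value)
```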

Managed Solution’s Team has the experience and expertise to architect SQL database and reporting systems tailored for your environment. Contact us for more information 800-307-0296


 

Managed Solution is a full-service technology firm that empowers businesses by delivering, maintaining and forecasting the technologies they’ll need to stay competitive in their marketplace. Founded in 2002, the company quickly grew into a market leader and is recognized as one of the fastest growing IT companies in Southern California.

We specialize in providing full managed services to businesses of every size, industry, and need.[/vc_column_text][/vc_column][/vc_row]


[vc_row][vc_column][vc_column_text]

Small and midsize businesses can now get backup and recovery levels that were once available only to large, deep-pocketed organizations.
Cloud services are gaining fresh attention for their business-continuity advantages. One basic draw is the redundant nature of the cloud platform itself. Cloud services run on geographically diverse server clusters, in which a downed node automatically fails over to a live one. Users keep on working, unaffected and none the wiser.
Avoiding the steep costs of downtime is a major benefit for small and midsize businesses. The cloud means that for most companies, the costly, inefficient days of maintaining a redundant hot site for disaster recovery are behind them.



[/vc_column_text][/vc_column][/vc_row]

[vc_row][vc_column][vc_column_text]

SQL Server 2016: The database for mission-critical intelligence

By Joseph Sirosh, Corporate Vice President, Data Group, Microsoft as written on blogs.microsoft.com
The world around us, every business and nearly every industry, is being transformed by technology. This disruption is driven, in part, by the intersection of three trends: a massive explosion of data, intelligence from machine learning and advanced analytics, and the economics and agility of cloud computing.
While databases power nearly every aspect of business today, they were not originally designed with this disruption in mind. Traditional databases were about recording and retrieving transactions such as orders and payments very reliably, very securely and efficiently. They were designed to enable reliable, secure, mission-critical transactional applications at small to medium scale, in on-premises datacenters.
Databases built to get ahead of today’s disruptions do very fast analyses of live data in-memory as transactions are being recorded or queried. They support very low latency advanced analytics and machine learning, such as forecasting and predictive models, on the same data, so that applications can easily embed data-driven intelligence. They allow databases to be offered as a fully managed service in the cloud, in turn making it easy to build and deploy intelligent Software as a Service (SaaS) apps.
They also provide innovative security features built for a world where a majority of data is accessible over the Internet. They support 24×7 high-availability, efficient management and database administration across platforms. They therefore enable mission critical intelligent applications to be built and managed both in the cloud and on-premises. They are exciting harbingers of a new world of ambient intelligence.
We built SQL Server 2016 for this new world, and to help businesses get ahead of today’s disruptions. It supports hybrid transactional/analytical processing, advanced analytics and machine learning, mobile BI, data integration, Always Encrypted query processing capabilities and in-memory transactions with persistence. It is also perhaps the world’s only relational database to be “born cloud-first,” with the majority of features first deployed and tested in Azure, across 22 global datacenters and billions of requests per day. It is customer tested and battle ready.
Let me share with you what industry analysts and our customers think.
Industry analysts recognize the breadth and depth of our capabilities in data, intelligence and the cloud. Microsoft is the only company recognized as a leader across data platforms and cloud by Gartner in both vision and execution, in database, business intelligence, advanced analytics, data warehouse, cloud infrastructure and cloud application platforms.
The customers we’ve been working with in preview share our excitement and are already benefiting from new innovations, such as built-in analytics.
PROS Holdings, Inc. (NYSE: PRO) is a revenue and profit realization company that helps B2B and B2C customers achieve their business goals through data science. Royce Kallesen, senior director of Science and Research at PROS, says: “Microsoft R’s parallelization and enhanced memory management on the server integrated with SQL Server provides much faster results on a common platform with built-in security.” They have realized over 100x faster advanced analytics using SQL Server and built-in Microsoft R Server.
In addition to faster analytics, the real-time in-memory processing capabilities of SQL Server are industry leading. This technology allows up to 100x faster analytics with updatable in-memory columnstores. In addition, as the only commercial database that leads simultaneously in both transaction processing (per the TPC-E benchmark) and data warehousing (per the TPC-H benchmark), SQL Server allows customers to realize incredible performance against massive data sets and gain real-time insights – across all workloads, new and existing applications.
“KPMG observed approximately 60 percent reduction in execution time and 10x table-compression gains for one of the main analytical procedures by leveraging Columnstore Indexes and Parallel Insert functions in SQL Server 2016,” says Michael S. Sellman, executive director, Global IT Services, KPMG LLP.
According to Chris Stolte, co-founder and chief development officer of Tableau, Inc., “an average 190%+ interactive query performance improvement enables our customers to visually explore large datasets in real-time, even against transactional databases.”
With unique hybrid capabilities, any SQL Server deployment or app can span private clouds, hosted clouds and our public cloud, Microsoft Azure. New Stretch Database technology allows customers to dynamically, transparently and securely stretch their transactional data to Azure, creating a massive database with great price performance. Customers can also use new AlwaysOn Availability Groups to enable disaster recovery at low cost.
Security has never been more important, and we’re humbled to be the industry’s least vulnerable database, six years running, according to the National Institute of Standards and Technology (NIST) public security board. Several capabilities in SQL Server 2016 help protect data at rest and in memory (Always Encrypted), encrypt all user data with low performance overhead (Transparent Data Encryption), mitigate attacks with support for Transport Layer Security version 1.2, and allow developers to build applications that restrict access and protect data from specific users with Dynamic Data Masking (DDM) and Row Level Security (RLS).
DocuSign helps organizations build entire approval workflows without a single sheet of paper or filing cabinet in sight, so security and reliability are critical, as they are with every business today. DocuSign partnered with Microsoft to help secure their customers’ data, realize insights with SQL Server analytics and BI capabilities, and receive world-class support, as their chief architect and vice president of Platforms, Eric Fleischman, attests.

With all of these capabilities built-in, SQL Server 2016 delivers not just a relational database, but an entire data platform for your business with incredible TCO. Today customers can save up to $10 million over three years versus Oracle, running transactional, data warehouse, data integration, business intelligence and advanced analytics workloads.* Judson Althoff, president of Microsoft North America, announced a new program to help more customers adopt SQL Server 2016 and save. Specifically, customers currently running applications or workloads on non-Microsoft paid commercial RDBMS platform will be able to migrate their existing applications with free SQL Server licenses.**
Microsoft is delivering on a vision that no other company can match across data, intelligence and cloud. To learn more, watch the webcast as well as various on-demand videos that showcase the new capabilities of this database built for mission-critical intelligence.

Managed Solution’s Team has the experience and expertise to architect SQL database and reporting systems tailored for your environment. Contact us for more information 800-307-0296


Managed Solution is a full-service technology firm that empowers businesses by delivering, maintaining and forecasting the technologies they’ll need to stay competitive in their marketplace. Founded in 2002, the company quickly grew into a market leader and is recognized as one of the fastest growing IT companies in Southern California.

 

We specialize in providing full Microsoft solutions to businesses of every size, industry, and need.[/vc_column_text][/vc_column][/vc_row]

Contact us Today!

Chat with an expert about your business’s technology needs.