[vc_row][vc_column][vc_column_text]

New Azure services help more people realize the possibilities of big data

By T. K. “Ranga” Rengarajan as written on blogs.microsoft.com
This week in San Jose thousands of people are at Strata + Hadoop World to explore the technology and business of big data. As part of our participation in the conference, we are pleased to announce new and enhanced Microsoft data services: a preview of Azure HDInsight running on Linux, the general availability of Storm on HDInsight, the general availability of Azure Machine Learning, and the availability of Informatica technology on Azure.
These new services are part of our continued investment in a broad portfolio of solutions to unlock insights from data. They can help businesses dramatically improve their performance, enable governments to better serve their citizenry, or accelerate new advancements in science. Our goal is to make big data technology simpler and more accessible to the greatest number of people possible: big data pros, data scientists and app developers, but also everyday businesspeople and IT managers. Azure is at the center of our strategy, offering customers scale, simplicity and great economics. And we’re embracing open technologies, so people can use the tools, languages and platforms of their choice to pull the maximum value from their data.
Simply put, we want to bring big data to the mainstream.
Azure HDInsight, our Apache Hadoop-based service in the cloud, is a prime example. It makes it easy for customers to crunch petabytes of all types of data with fast, cost-effective scale on demand, as well as programming extensions so developers can use their favorite languages. Customers like Virginia Tech, Chr. Hansen, Mediatonic and many others are using it to find important data insights. And, today, we are announcing that customers can run HDInsight on Ubuntu clusters (the leading scale-out Linux), in addition to Windows, with simple deployment, a managed service level agreement and full technical support. This is particularly compelling for people who already use Hadoop on Linux on-premises, such as on the Hortonworks Data Platform, because they can use common Linux tools, documentation, and templates and extend their deployment to Azure with hybrid cloud connections.

ubuntu_msft_2_18_new

Storm for Azure HDInsight, generally available today, is another example of making big data simpler and more accessible. Storm is an open source stream analytics platform that can process millions of data “events” in real time as they are generated by sensors and devices. Using Storm with HDInsight, customers can deploy and manage applications for real-time analytics and Internet-of-Things scenarios in a few minutes with just a few clicks. Linkury is using HDInsight with Storm for its online monetization services, for example. We are also making Storm available for both .NET and Java, along with the ability to develop, deploy, and debug real-time Storm applications directly in Visual Studio. That helps developers be productive in the environments they know best.
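For readers new to stream processing, here is a rough Python sketch of the kind of per-event logic a Storm bolt typically performs: counting sensor events over a tumbling window. It only illustrates the pattern; real topologies on HDInsight are written against the Java or .NET Storm APIs, and the simulated sensor stream below is made up.

```python
# Illustrative only: the shape of a simple "bolt" that counts sensor events
# per device over a tumbling window. Not the actual Storm Java/.NET APIs.
import random
import time
from collections import Counter

WINDOW_SECONDS = 10

def sensor_events():
    """Simulated stream: yields (device_id, reading) tuples forever."""
    while True:
        yield random.choice(["sensor-1", "sensor-2", "sensor-3"]), random.random()
        time.sleep(0.01)

def run(event_stream):
    """Count events per device over a tumbling window and 'emit' the totals."""
    counts = Counter()
    window_start = time.time()
    for device_id, _reading in event_stream:
        counts[device_id] += 1
        if time.time() - window_start >= WINDOW_SECONDS:
            # In a real topology this would be emitted to a downstream bolt
            # (for example, a dashboard writer or an alerting component).
            print(dict(counts))
            counts.clear()
            window_start = time.time()

# run(sensor_events())  # uncomment to watch counts roll over every 10 seconds
```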
You can read this blog to learn about these and other updates we’re making to HDInsight to make Hadoop simpler and easier to use on Azure.
Azure Machine Learning, also generally available today, further demonstrates our commitment to help more people and organizations use the cloud to unlock the possibilities of data. It is a first-of-its-kind, managed cloud service for advanced analytics that makes it dramatically simpler for businesses to predict future trends with data. In mere hours, developers and data scientists can build and deploy apps to improve customer experiences, predict and prevent system failures, enhance operational efficiencies, uncover new technical insights, or realize a universe of other benefits. Such advanced analytics normally take weeks or months and require extensive investment in people, hardware and software to manage big data. Also, now developers – even those without data science training – can use the Machine Learning Marketplace to find APIs and finished services, such as recommendations, anomaly detection and forecasting, in order to deploy solutions quickly. Already customers like Pier 1, Carnegie Mellon, eSmart Systems, Mendeley and ThyssenKrupp are finding value in their data with Azure Machine Learning.

Azure Machine Learning reflects our support for open source. The Python programming language is a first-class citizen in Azure Machine Learning Studio, along with R, the popular language of statisticians. New breakthrough algorithms, such as “Learning with Counts,” now allow customers to learn from terabytes of data. A new community gallery also allows data scientists to share experiments via Twitter and LinkedIn. You can read more about these innovations and how customers are using Azure Machine Learning in this blog post.
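To give a sense of the idea behind count-based learning, the sketch below shows the general technique in pandas: replace a high-cardinality categorical column with per-class counts computed on training data. It is only an illustration of the concept, not the Azure Machine Learning “Learning with Counts” module itself, and the column names are made up.

```python
# A minimal sketch of count-based featurization: replace a high-cardinality
# categorical column with per-class counts learned from training data.
# Illustrates the general technique only; column names are invented.
import pandas as pd

train = pd.DataFrame({
    "ad_id":   ["a1", "a2", "a1", "a3", "a2", "a1"],
    "clicked": [1, 0, 1, 0, 0, 1],
})

# Count clicks and impressions per ad_id on the training set.
counts = (
    train.groupby("ad_id")["clicked"]
         .agg(clicks="sum", views="count")
         .assign(ctr=lambda d: d["clicks"] / d["views"])
)

# At scoring time, join the learned counts onto new rows; unseen ids fall
# back to zeros (a production system would smooth these estimates).
new_rows = pd.DataFrame({"ad_id": ["a1", "a4"]})
features = new_rows.join(counts, on="ad_id").fillna(0)
print(features)
```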
Another key part of our strategy is to offer customers a wide range of partner solutions that build on and extend the benefits of Azure data services. Today, data integration leader Informatica is joining the growing ecosystem of partners in the Azure Marketplace. The Informatica Cloud agent is now available on Linux and Windows virtual machines on Azure. That will enable enterprise customers to create data pipelines from both on-premises systems and the cloud to Azure data services such as Azure HDInsight, Azure Machine Learning, Azure Data Factory and others, for management and analysis.
The value provided by our data services multiplies when customers use them together. A case in point is Ziosk, maker of the world’s first ordering, entertainment and pay-at-the-table tablet. They are using Azure HDInsight, Azure Machine Learning, our Power BI analytics service and other Microsoft technologies to help restaurant chains like Chili’s drive guest satisfaction, frequency and advocacy with data from tabletop devices in 1,400 locations.
This week the big data world is focused on Strata + Hadoop World, a great event for the industry and community. It’s exciting to consider the new ideas and innovations happening around the world every day with data. Here at Microsoft, we’re thrilled to be part of it and to fuel that innovation with data solutions that give customers simple but powerful capabilities, using their choice of tools and platforms in the cloud.

[/vc_column_text][/vc_column][/vc_row]

[vc_row][vc_column][vc_column_text]

Introducing #AzureAD Pass-Through Authentication and Seamless Single Sign-on

By Alex Simons as written on blogs.technet.microsoft.com
Howdy folks,
Today’s news might well be our biggest news of the year. Azure AD Pass-Through Authentication and Seamless Single Sign-on are now both in public preview!
When we talk to organizations about how they want to integrate their identity infrastructure to the cloud, we often hear the same set of requirements: “I’ve got to have single sign-on for my users, passwords need to stay on-premises, and I can’t have any un-authenticated end points on the Internet. And make sure it is super easy”.
We heard your feedback, and now the wait is over. I’m excited to announce we have added a set of new capabilities in Azure AD to meet all those requirements: Pass-Through Authentication and Seamless Single Sign-on to Azure AD Connect! These new capabilities allow customers to securely and simply integrate their on-premises identity infrastructure with Azure AD.

Azure AD pass-through authentication

Azure AD pass-through authentication provides a simple, secure, and scalable model for validating passwords against your on-premises Active Directory via a simple connector deployed in the on-premises environment. This connector uses only secure outbound communications, so no DMZ is required, nor are there any unauthenticated endpoints on the Internet.
That’s right. User passwords are validated against your on-premises Active Directory, without needing to deploy ADFS servers!
We also automatically balance the load between the set of available connectors for both high availability and redundancy, without requiring additional infrastructure. We made the connector super lightweight so it can be easily incorporated into your existing infrastructure and even deployed on your Active Directory domain controllers.
The system works by passing the password entered on the Azure AD login page down to the on-premises connector. That connector then validates it against the on-premises domain controllers and returns the results. We’ve also made sure to integrate with self-service password reset (SSPR) so that, should the user need to change their password, it can be routed back to on-premises for a complete solution. There is absolutely no caching of the password in the cloud. Find more details about this process in our documentation.
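Conceptually, the outbound-only pattern looks something like the sketch below. Everything in it is hypothetical, including the service URL, payload shape, and the local validation helper; it is meant only to illustrate why no inbound connection or DMZ endpoint is needed, not to describe the actual connector implementation.

```python
# Conceptual sketch of an outbound-only connector pattern. The service URL,
# payload shape, and validate_against_local_ad() are hypothetical; the real
# Azure AD connector is not implemented this way verbatim.
import time
import requests

SERVICE_URL = "https://example.invalid/passthrough"  # hypothetical endpoint

def validate_against_local_ad(username: str, password: str) -> bool:
    """Placeholder for an LDAP bind / logon check against local AD."""
    raise NotImplementedError

def run_connector():
    while True:
        # 1. Outbound poll: ask the cloud service for pending sign-in checks.
        pending = requests.get(f"{SERVICE_URL}/pending", timeout=30).json()
        for req in pending:
            # 2. Validate locally; the credential check happens on-premises.
            ok = validate_against_local_ad(req["username"], req["password"])
            # 3. Post back only the success/failure result, again outbound.
            requests.post(f"{SERVICE_URL}/result",
                          json={"id": req["id"], "success": ok},
                          timeout=30)
        time.sleep(1)
```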

Seamless single sign-on for all

Single sign-on is one of the most important aspects of the end-user experience our customers think through as they move to cloud services. You need more than just single sign-on for interactions between cloud services – you also need to ensure users won’t have to enter their passwords over and over again.
With the new single sign-on additions in Azure AD Connect you can enable seamless single sign-on for your corporate users (users on domain joined machines on the corporate network). In doing so, users are securely authenticated with Kerberos, just like they would be to other domain-joined resources, without needing to type passwords.
The beauty of this solution is that it doesn’t require any additional infrastructure on-premises, since it simply uses your existing Active Directory services. This is also an opportunistic feature: if, for some reason, a user can’t obtain a Kerberos ticket for single sign-on, they will simply be prompted for their password, just as they are today. It is available for both password hash sync and Azure AD pass-through authentication customers. Read more on seamless single sign-on in this documentation article.

Enabling these new capabilities

Download the latest version of Azure AD Connect now to get these new capabilities! You’ll find the new options in a custom install for new deployments, or, for existing deployments, when you change your sign-in method.

clip_image002_thumb2

The fine print

As with all previews, there are some limits to what we currently support. We are working hard to ensure we provide full support across all systems. You can find the full list of supported clients and operating systems in the documentation, which we’ll keep updated as things change.
Also, keep in mind that this is an authentication feature, so it’s best to try it out in a test environment to ensure you understand the end-user experience and how switching from one sign-on method to another will change that experience.
And last but by no means least, it’s your feedback that pushes us to make improvements like this to our products, so keep it coming. I look forward to hearing what you think!
Best regards,
Alex Simons

[/vc_column_text][/vc_column][/vc_row]

elasticsearch on azure - managed solution

Guidance for running Elasticsearch on Azure

By Masashi Narumoto as written on azure.microsoft.com
Elasticsearch is a scalable open source search engine and database that has been gaining popularity among developers building cloud-based systems. When suitably configured, it is capable of ingesting and efficiently querying large volumes of data very rapidly.
It’s reasonably straightforward to build and deploy an Elasticsearch cluster on Azure. You can create a set of Windows or Linux VMs, then download the appropriate Elasticsearch packages to install on each VM. Alternatively, we have published an ARM template you can use with the Azure portal to automate much of the process.
Elasticsearch is highly configurable, but we’ve witnessed many systems where a poor selection of options has led to slow performance. One reason for this is that there are many factors you need to take into account in order to achieve the best throughput and most responsive system, including:

• The cluster topology (client nodes, master nodes and data nodes)
• The structure of each index (the number of shards and replicas to specify)
• The virtual hardware (disk capacity and speed, amount of memory, number of CPUs)
• The allocation of resources on each cluster (disk layout, Java Virtual Machine memory usage, Elasticsearch queues and threads, I/O buffers)

You cannot consider these items in isolation, because the nature of workloads you are running will also have great bearing on the performance of the system. An installation optimized for data ingestion might not be well-tuned for queries, and vice versa. Therefore, you need to balance the requirements of the different operations your system needs to support. For these reasons, we spent considerable time working through a series of configurations, performing numerous tests and analyzing the results.
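As one concrete example of those trade-offs, index-level settings such as the number of shards and replicas are decisions you make when an index is created. A minimal sketch with the official Elasticsearch Python client is shown below; the host, index name, and values are placeholders and should be chosen to match your own workload.

```python
# Minimal sketch: create an index with explicit shard/replica settings using
# the official Elasticsearch Python client. Host, index name, and the numbers
# chosen here are placeholders; the right values depend on your workload.
from elasticsearch import Elasticsearch

es = Elasticsearch(["http://your-cluster:9200"])

es.indices.create(
    index="events",
    body={
        "settings": {
            "number_of_shards": 6,       # spread data across the data nodes
            "number_of_replicas": 1,     # one copy for resilience and read scale
            "refresh_interval": "30s",   # relax refresh for ingest-heavy loads
        }
    },
)
```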
The purpose was to illustrate how you can design and build an Elasticsearch cluster to meet your own requirements, and to show how you can test and tune performance. This guidance is now available in Azure documentation. We provided a series of documents covering:
• General guidance on Elasticsearch, describing the configuration options available and how you can apply them to a cluster running on Azure
• Specific guidance on deploying, configuring, and testing an Elasticsearch cluster that must support a high level of data ingestion operations
• Guidance and considerations for Elasticsearch systems that must support mixed workloads and/or query-intensive systems
We used Apache JMeter to conduct performance tests and incorporated JUnit tests written using Java. Then we captured the performance data as a set of CSV files and used Excel to graph and analyze the results. We also used Elasticsearch Marvel to monitor systems while the tests were running.
If you'd like to repeat these tasks on your own setup, the documentation provides instructions on how to create your own JMeter test environment and gather performance information from Elasticsearch, in addition to providing scripts to run our JMeter tests.
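If you would rather script the analysis than use Excel, a short pandas sketch over a JMeter-style results CSV can compute throughput and latency percentiles. The file path is a placeholder, and the column names assume JMeter's standard timeStamp, elapsed, and success fields.

```python
# Sketch: summarize a JMeter results CSV (results.csv is a placeholder path).
# Assumes JMeter's standard CSV columns: timeStamp (ms epoch), elapsed (ms), success.
import pandas as pd

df = pd.read_csv("results.csv")

duration_s = (df["timeStamp"].max() - df["timeStamp"].min()) / 1000.0
ok = df["success"].astype(str).str.lower().eq("true")

summary = {
    "requests": len(df),
    "throughput_rps": len(df) / duration_s if duration_s else float("nan"),
    "error_rate": 1.0 - ok.mean(),
    "latency_ms_p50": df["elapsed"].quantile(0.50),
    "latency_ms_p95": df["elapsed"].quantile(0.95),
    "latency_ms_p99": df["elapsed"].quantile(0.99),
}
print(summary)
```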

[vc_row][vc_column][vc_column_text]

Azure Site Recovery & Backup

As statistics go, it’s telling. Ninety percent of executives recently surveyed agreed that they needed a business continuity and disaster recovery (BCDR) plan. Is your organization still without a BCDR plan? If so, we can help.

Drive Business Results Through Microsoft Azure Site Recovery (ASR) & Azure Backup

Simple, Automated Protection: With Azure Site Recovery, you can protect Hyper-V, VMware, and even physical servers, with orchestrated recovery of services in the event of a site outage at the primary data center. Create multiple recovery plans to fail over only certain applications when you have a particular failure in your data center. Test recovery with confidence: the test failover feature gives you confidence in the recovery solution and helps you meet the SLAs for your business, and you can perform planned failovers with zero data loss when you know about a disaster situation in advance.

capabilities of BCDR plan

Did you know...

According to research by the University of Texas, only 6% of companies suffering from a catastrophic data loss survive, while 43% never reopen and 51% close within two years. ASSESS, ENABLE, and CAPTURE with your business's Azure Site Recovery plan. Call 800-208-3617 to get started!


[/vc_column_text][/vc_column][/vc_row]

Azure Backup security capabilities for protecting cloud backups

By Pallavi Joshi as written on azure.microsoft.com

More and more cloud customers are being hit with security issues; as a result, awareness of the importance of Azure Backup security is increasing rapidly.

These security issues result in data loss, and the cost of a security breach keeps rising. Despite having security measures in place, organizations face cyber threats because of vulnerabilities exposed by multiple IT systems.

All of these data points raise very strong questions: Are your organization’s IT applications and data safe? What is the cost of recovering from the huge business impact of a cyber attack? If you have a backup strategy in place, are your cloud backups secure?

“Currently, there are over 120 separate ransomware families, and we’ve seen a 3500% increase in cybercriminal internet infrastructure for launching attacks since the beginning of the year.” - CRN Quarterly Ransomware Report.

To mitigate the threat of such attacks, the FBI recommends that users regularly back up their data and secure those backups in the cloud. Continue reading today's blog to learn about the security features in Azure Backup that help secure hybrid backups.

Value proposition

Malware attacks today target production servers to either encrypt the data or remove it permanently. And if production data is affected, network shares and backups can be affected as well, which can lead to data loss or data corruption.

Hence, there is a strong need to protect production as well as backup data against sophisticated attacks and have a strong security strategy in place to ensure data recoverability.

Azure Backup now provides security capabilities to protect cloud backups. These security features ensure that customers are able to secure their backups and recover data using cloud backups if production and backup servers are compromised.

These features are built on three principles – Prevention, Alerting and Recovery – to enable organizations to increase their preparedness against attacks and equip them with a robust backup solution.

Azure Backup Security Principles

Features

  1. Prevention: A new authentication layer has been added for critical operations like Delete Backup Data and Change Passphrase. These operations now require a security PIN, which is available only to users with valid Azure credentials.
  2. Alerting: Email notifications are sent for any critical operations that impact the availability of backup data. These notifications enable users to detect attacks as soon as they occur.
  3. Recovery: Azure Backup retains deleted backup data for 14 days, ensuring recovery using any old or recent recovery point. A minimum number of recovery points is also always maintained, so that there are always sufficient points to recover from.

Getting started with security features

To start leveraging these features, navigate to your Recovery Services vault in the Azure portal and enable them, or simply contact one of our experts here.

sales-management-solution-template-dynamics-365-managed-solution

Announcing the Sales Management Solution Template for Dynamics 365 with Data Export

Written by Richard Tkachuk as seen on Microsoft Corporation
Back in May, we announced the sales management solution template, which simplified and accelerated building powerful and compelling Power BI solutions on Dynamics CRM (now Dynamics 365). The sales management solution template offered a very fast guided experience to create compelling reports on an extensible, scalable, and secure architecture that could be customized however one needed. This meant that instead of spending time on plumbing, you could spend it on extending and customizing the solution template to meet your organization’s needs. Today, I’m pleased to announce the integration of Dynamics CRM Data Export with the sales management solution template for Dynamics 365.
Data Export is a free add-on service made available as a Microsoft Dynamics 365 solution that adds the ability to replicate Dynamics 365 online data to a Microsoft Azure SQL Database store in a customer-owned Microsoft Azure subscription.
clip_image002
There’s no more scheduling – your data is automatically replicated from Dynamics 365 as soon as it changes.
With this new capability, customers can use the sales management solution template to create an enterprise-ready Power BI analytics solution on data that is replicated to their Azure SQL database in a fast, robust, and scalable manner.
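Because the replicated data lands in an ordinary Azure SQL database, you can also query it directly from your own tools. The sketch below uses pyodbc; the server, database, credentials, and the entity table and column names are placeholders that depend on the schema Data Export creates for your instance.

```python
# Sketch: query Dynamics 365 data replicated by Data Export into Azure SQL.
# The connection details and the table/column names (dbo.opportunity,
# estimatedvalue, statecode) are placeholders; check your exported schema.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=your-server.database.windows.net;"
    "DATABASE=your-dynamics-export-db;"
    "UID=your-user;PWD=your-password"
)

sql = """
SELECT statecode, COUNT(*) AS opportunities, SUM(estimatedvalue) AS pipeline
FROM dbo.opportunity
GROUP BY statecode
"""

for row in conn.cursor().execute(sql):
    print(row.statecode, row.opportunities, row.pipeline)
```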
Provisioning the solution template is just as simple as before, but now it's much, much faster with Data Export. You don't need to do anything new – if you're a Dynamics 365 administrator, just run through a couple of pages to let us know about your instance and we do the rest. Once you're done, all changes to your Dynamics 365 records are automatically updated in the Azure SQL database and available to your Power BI reports.
And while we were at it, we rejuvenated the Power BI reports to give them a clean new look – here’s what the new first page looks like:
clip_image004
I hope you can check it out – we’re very happy with the changes and hope you are as well.

new-azure-logic-apps-innovation-managed-solution

New Azure Logic Apps innovation – general availability of cloud-based Enterprise Integration Pack

Written by Frank Weigel as seen on Microsoft Corporation
Businesses are looking for more ways to reduce infrastructure costs without compromising service availability. As a result, companies are adopting newer cloud development architectures like serverless, giving rise to the need for event-triggered integration across multiple third-party services. Developers are turning to serverless solutions like Azure Logic Apps and Azure Functions to automate workflows and integrate systems, thereby accelerating application delivery and reducing costs. Logic Apps enables customers to quickly and easily build powerful integration solutions using a visual designer and a wide set of out-of-the-box connectors such as Dynamics CRM, Salesforce, Office 365 and many more.
Today I am excited to announce another important milestone in integration: the general availability of the Enterprise Integration Pack within Logic Apps, which further simplifies business-to-business (B2B) communications in the cloud. It enables you to process business transactions more easily and reliably, track and troubleshoot B2B events, and leverage additional out-of-the-box connectors.

Electronic Data Interchange (EDI) and Business-to-Business (B2B) transactions

With the Enterprise Integration Pack, you can take advantage of a faster, more reliable and more versatile B2B/EDI solution than traditional integration offerings. Integration accounts within the Enterprise Integration Pack let you quickly create and manage cloud-based B2B artifacts such as maps, schemas, trading partners, agreements and certificates. With this release, electronic data interchange (EDI) has never been easier: you can send, receive and troubleshoot B2B transactions across a wide variety of protocols, including AS2, EDIFACT and X12. Customers like Mission Linen Supply are already realizing the benefits of EDI capabilities in Logic Apps:
“Today, with our Azure Logic Apps solution, we can get suppliers onboarded within two weeks versus the two months or longer that the [Electronic Data Interchange] provider required. The faster we can integrate partners, the faster we can grow our business.”
– Dave Pattison, Director of IT, Mission Linen Supply
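To give a feel for the wire format involved, the short sketch below splits a raw X12 interchange into segments and elements. The sample interchange and separators are made up, and Logic Apps handles all of this for you, including validation and acknowledgements; the sketch is only meant to illustrate the structure of the messages being exchanged.

```python
# Illustrative only: split a raw X12 interchange into segments and elements.
# The sample interchange and separators ('~' segment, '*' element) are made
# up; in Logic Apps the X12 decode steps handle parsing, validation and acks.
raw = (
    "ISA*00*          *00*          *ZZ*SENDERID       *ZZ*RECEIVERID     "
    "*170101*1253*U*00401*000000905*0*T*:~"
    "GS*PO*SENDERID*RECEIVERID*20170101*1253*905*X*004010~"
    "ST*850*0001~"
    "BEG*00*NE*4500012345**20170101~"
    "SE*3*0001~GE*1*905~IEA*1*000000905~"
)

SEGMENT_TERMINATOR = "~"
ELEMENT_SEPARATOR = "*"

segments = [s for s in raw.split(SEGMENT_TERMINATOR) if s]
for seg in segments:
    elements = seg.split(ELEMENT_SEPARATOR)
    print(elements[0], elements[1:])  # segment id, then its data elements
```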
Below is a view of the new integration account within Enterprise Integration Pack:
integrationaccount

Management capabilities

The ability to view and troubleshoot B2B events via system management solutions is as important as enabling comprehensive EDI capabilities. With the Enterprise Integration Pack, you can track B2B events in a number of flexible ways, including built-in tracking that can be routed to Microsoft Operations Management Suite (OMS) and viewed in the out-of-the-box tracking portal. You can easily view and troubleshoot B2B transactions over the AS2 and X12 formats (with EDIFACT coming in the next few weeks). Additionally, a new RESTful tracking API enables you to send tracking events from both Logic App executions and other applications for end-to-end visibility. You can also add and correlate tracking data across your entire business process in Operations Management Suite.
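As an illustration of what sending a custom tracking event from your own application might look like, the sketch below posts an event over HTTPS. The endpoint URL, authentication, and payload fields are hypothetical placeholders rather than the documented API contract, so treat it as a shape to adapt, not a recipe.

```python
# Hypothetical sketch of sending a custom B2B tracking event over REST.
# The URL, token, and payload fields are placeholders; consult the Logic Apps
# tracking API documentation for the real contract.
import datetime
import requests

TRACKING_URL = "https://example.invalid/integrationAccounts/acct1/trackingEvents"
TOKEN = "<bearer-token-from-azure-ad>"  # placeholder credential

event = {
    "source": "order-service",
    "eventTime": datetime.datetime.utcnow().isoformat() + "Z",
    "recordType": "AS2Message",
    "correlationId": "PO-4500012345",
    "status": "Succeeded",
}

resp = requests.post(
    TRACKING_URL,
    json={"events": [event]},
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
```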
Here is a view of tracking B2B events through the Operations Management Suite portal:
Microsoft Operation Management Suite

Enterprise connectors

We realize that many of our customers are often dealing with mission-critical applications and business processes that can lead to complex, time-consuming connection configuration steps. With Logic Apps, we have added more enterprise connectors that make it simple and fast to establish connections with business applications. For instance, with the SAP connector, you can easily connect your on-premises SAP systems to cloud applications using Logic Apps, without the complex coding required by other serverless products in the market. Today, the MQ Series and SAP ECC connectors are in preview, with more connectors coming in the next few months. For a full list of all currently available connectors, please visit the Logic Apps connectors reference.

Get started today!

With the general availability of the Enterprise Integration Pack, you can now start using these services in production with a full SLA and support. We are committed to the continuous delivery of serverless compute and integration capabilities, and we will continue to share updates about investments and new releases. In the meantime, learn more and try our serverless offerings – Azure Logic Apps and Azure Functions.

conin-managed-solution

CONIN: Finding, reaching and helping at-risk children through the cloud

As written on Microsoft.com
Mendoza, Argentina-based CONIN works to not only eradicate child malnutrition in Argentina but also to serve disadvantaged families at risk of falling through gaps in social services. With Microsoft Azure and other Microsoft cloud services, the nonprofit helps more children and families receive the resources they need to thrive.
Its previous, analog-based systems helped the nonprofit impact the lives of thousands of youth. But with an Azure-powered IT system, CONIN is now poised to improve family health across Argentina and beyond, as it begins to expand its services in Latin America.

Smarter Data

As nonprofit leaders know, effectively tackling a problem begins with understanding it. So when CONIN set out to map malnutrition in Salta, a northern and remote province of Argentina, the family-focused organization turned to the flexible and comprehensive solutions of Azure.
By identifying the most pressing needs of different pockets of Salta—from lack of clean water to insufficient healthy food—CONIN input data into an application developed and hosted in Azure, which they then displayed and shared in Power BI. The result: an accurate and up-to-date visualization of on-the-ground realities. They could then partner with the regional government to pinpoint priorities and direct limited resources at the most urgent issues first. What’s more, this detailed map provided the data to develop a long-term solution: CONIN and the Salta government drafted public policy to prevent malnutrition in the future—and ensure the area’s children have a healthy start to life.
The impact of the Azure mapping solution does not end in Salta. CONIN is rolling out the system in other communities in Argentina as well as in parts of Latin America and Africa. And other nonprofits will save time and resources by transferring an already developed solution to their own Azure environment.

Connecting Communities

Like many nonprofits, CONIN relies on a combination of staff, volunteers and partners to carry out its mission. And although CONIN's dedicated team has already served 20,000 youth over several decades, leaders within the organization knew they could reach even more people in need by leveraging cloud-based tools.
CONIN then developed an app in Azure that triggers an alert whenever the nonprofit- and government-run community census identifies a child in need. Take, for example, an alert CONIN staff received about Bryan, a cheerful and playful boy who was born with Down syndrome and severe kidney problems. His mother, who didn't even know she was eligible for services, was invited to the nonprofit to begin the process of getting care for Bryan. CONIN paid for Bryan's surgeries, arranged for transportation to and from medical visits and now enrolls Bryan in a special CONIN school. The system also allows CONIN staff to track the boy's health with digital updates on his progress.
Without this alert app that automatically analyzes data to identify families in need, Bryan may never have gotten a fair shot at a happy, healthy future.

Field-to-office solutions

Proactive nonprofits not only help the at-risk families that seek them out but also bring in others who may never have known about their services. CONIN teams used to do this community outreach and information-gathering with inefficient paper surveys. Since receiving an Azure grant, they canvass Argentina's poorest neighborhoods with a digital, cloud-based polling solution.
“Today, the technology makes it much faster: It enables us to have every child in the system,” says Teresa Cornejo, president of a CONIN network member that addresses nutrition in Salta. As canvassers input information designed to identify families at risk of malnutrition, lack of education or unmet medical needs, the data is automatically synced to folders accessible anywhere, from CONIN's offices to employees traveling the dirt roads of Salta.
Canvassers go door-to-door with mobile phones or tablets, making the highly detailed data they collect immediately available on an Azure cloud platform. CONIN, other partner nonprofits and the government of Salta province use this up-to-the-minute information to work toward their goal: making malnutrition a problem of the past.

Contact us Today!

Chat with an expert about your business’s technology needs.