New blog launches: Microsoft AppSource, your destination for SaaS business apps

For anyone interested in creating, consuming or providing services for SaaS business apps on the Microsoft Cloud platform, Microsoft AppSource is the place to go.
AppSource is Microsoft's answer to a longstanding market need: connecting business customers with SaaS apps and integration services. It has already become a thriving ecosystem that gives business users one place to discover and trial apps built on the Microsoft Cloud platform, with implementation help from Microsoft partners.
To provide more information about AppSource and its growing collection of apps, ISVs, systems integrators, and customers, the team is adding another dimension to the service this week: a new blog. The blog will feature resources to help participants get the most from AppSource, along with the stories behind the apps and how they're bringing more value to customers.
Since the launch of AppSource in July, business users from all over the world have visited to check out the service. ISVs are submitting new apps every day, and more than 100 systems integrator partners have already come on board to help customers implement them.
The new blog adds a forum for sharing information and stories as this resource continues to grow. It's also a blog you'll probably want to bookmark; with so much going on, we're sure AppSource will have a lot to share in the near future.


New Azure services help more people realize the possibilities of big data

By T. K. “Ranga” Rengarajan as written on blogs.microsoft.com
This week in San Jose, thousands of people are at Strata + Hadoop World to explore the technology and business of big data. As part of our participation in the conference, we are pleased to announce new and enhanced Microsoft data services: a preview of Azure HDInsight running on Linux, the general availability of Storm on HDInsight, the general availability of Azure Machine Learning, and the availability of Informatica technology on Azure.
These new services are part of our continued investment in a broad portfolio of solutions to unlock insights from data. They can help businesses dramatically improve their performance, enable governments to better serve their citizenry, or accelerate new advancements in science. Our goal is to make big data technology simpler and more accessible to the greatest number of people possible: big data pros, data scientists and app developers, but also everyday businesspeople and IT managers. Azure is at the center of our strategy, offering customers scale, simplicity and great economics. And we’re embracing open technologies, so people can use the tools, languages and platforms of their choice to pull the maximum value from their data.
Simply put, we want to bring big data to the mainstream.
Azure HDInsight, our Apache Hadoop-based service in the cloud, is a prime example. It makes it easy for customers to crunch petabytes of all types of data with fast, cost-effective scale on demand, as well as programming extensions so developers can use their favorite languages. Customers like Virginia Tech, Chr. Hansen, Mediatonic, and many others are using it to find important data insights. And, today, we are announcing that customers can run HDInsight on Ubuntu clusters (the leading scale-out Linux), in addition to Windows, with simple deployment, a managed service level agreement, and full technical support. This is particularly compelling for people who already run Hadoop on Linux on-premises, such as on the Hortonworks Data Platform, because they can use common Linux tools, documentation, and templates, and extend their deployment to Azure with hybrid cloud connections.

Storm for Azure HDInsight, generally available today, is another example of making big data simpler and more accessible. Storm is an open source stream analytics platform that can process millions of data "events" in real time as they are generated by sensors and devices. Using Storm with HDInsight, customers can deploy and manage applications for real-time analytics and Internet-of-Things scenarios in a few minutes with just a few clicks. Linkury, for example, is using HDInsight with Storm for its online monetization services. We are also making Storm available for both .NET and Java, along with the ability to develop, deploy, and debug real-time Storm applications directly in Visual Studio. That helps developers stay productive in the environments they know best.
You can read this blog to learn about these and other updates we’re making to HDInsight to make Hadoop simpler and easier to use on Azure.
Azure Machine Learning, also generally available today, further demonstrates our commitment to helping more people and organizations use the cloud to unlock the possibilities of data. It is a first-of-its-kind managed cloud service for advanced analytics that makes it dramatically simpler for businesses to predict future trends with data. In mere hours, developers and data scientists can build and deploy apps to improve customer experiences, predict and prevent system failures, enhance operational efficiencies, uncover new technical insights, or realize a universe of other benefits. Such advanced analytics normally take weeks or months and require extensive investment in people, hardware, and software to manage big data. Developers, even those without data science training, can now use the Machine Learning Marketplace to find APIs and finished services, such as recommendations, anomaly detection, and forecasting, in order to deploy solutions quickly. Already, customers like Pier 1, Carnegie Mellon, eSmart Systems, Mendeley, and ThyssenKrupp are finding value in their data with Azure Machine Learning.

Azure Machine Learning also reflects our support for open source. The Python programming language is a first-class citizen in Azure Machine Learning Studio, along with R, the popular language of statisticians. New breakthrough algorithms, such as "Learning with Counts," now allow customers to learn from terabytes of data. A new community gallery also lets data scientists share experiments via Twitter and LinkedIn. You can read more about these innovations and how customers are using Azure Machine Learning in this blog post.
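As a concrete illustration, here is a minimal sketch of calling a predictive web service published from Azure Machine Learning Studio, using only the Python standard library. The endpoint URL, API key, and input columns below are placeholders rather than real values; the exact request schema for your own service appears on its API help page.

    import json
    import urllib.request

    # Placeholders: copy the real URL and key from your service's API help page.
    ENDPOINT_URL = "https://example.services.azureml.net/.../execute?api-version=2.0"
    API_KEY = "<your-api-key>"

    # Request-response payload; the column names and values are illustrative only.
    payload = {
        "Inputs": {
            "input1": {
                "ColumnNames": ["temperature", "pressure", "vibration"],
                "Values": [[72.0, 101.3, 0.004]],
            }
        },
        "GlobalParameters": {},
    }

    request = urllib.request.Request(
        ENDPOINT_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + API_KEY,
        },
    )

    # The response carries the scored labels or probabilities produced by the model.
    with urllib.request.urlopen(request) as response:
        print(json.loads(response.read()))

The same pattern works from any language that can issue an HTTPS request, which is part of what makes these services accessible to developers without data science training.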
Another key part of our strategy is to offer customers a wide range of partner solutions that build on and extend the benefits of Azure data services. Today, data integration leader Informatica is joining the growing ecosystem of partners in the Azure Marketplace. The Informatica Cloud agent is now available in Linux and Windows virtual machines on Azure. That will enable enterprise customers to create data pipelines from both on-premises systems and the cloud to Azure data services such as Azure HDInsight, Azure Machine Learning, Azure Data Factory and others, for management and analysis.
The value provided by our data services multiplies when customers use them together. A case in point is Ziosk, maker of the world's first ordering, entertainment, and pay-at-the-table tablet. They are using Azure HDInsight, Azure Machine Learning, our Power BI analytics service, and other Microsoft technologies to help restaurant chains like Chili's drive guest satisfaction, frequency, and advocacy with data from tabletop devices in 1,400 locations.
This week the big data world is focused on Strata + Hadoop World, a great event for the industry and community. It’s exciting to consider the new ideas and innovations happening around the world every day with data. Here at Microsoft, we’re thrilled to be part of it and to fuel that innovation with data solutions that give customers simple but powerful capabilities, using their choice of tools and platforms in the cloud.


Introducing #AzureAD Pass-Through Authentication and Seamless Single Sign-on

By Alex Simons as written on blogs.technet.microsoft.com
Howdy folks,
Today’s news might well be our biggest news of the year. Azure AD Pass-Through Authentication and Seamless Single Sign-on are now both in public preview!
When we talk to organizations about how they want to integrate their identity infrastructure with the cloud, we often hear the same set of requirements: "I've got to have single sign-on for my users, passwords need to stay on-premises, and I can't have any unauthenticated endpoints on the Internet. And make sure it is super easy."
We heard your feedback, and now the wait is over. I'm excited to announce that we have added a set of new capabilities to Azure AD Connect that meet all of those requirements: Pass-Through Authentication and Seamless Single Sign-on! These new capabilities allow customers to securely and simply integrate their on-premises identity infrastructure with Azure AD.

Azure AD pass-through authentication

Azure AD pass-through authentication provides a simple, secure, and scalable model for validating passwords against your on-premises Active Directory, via a simple connector deployed in the on-premises environment. This connector uses only secure outbound communications, so no DMZ is required, nor are there any unauthenticated endpoints on the Internet.
That’s right. User passwords are validated against your on-premises Active Directory, without needing to deploy ADFS servers!
We also automatically balance the load between the set of available connectors for both high availability and redundancy, without requiring additional infrastructure. We made the connector super lightweight so it can be easily incorporated into your existing infrastructure and even deployed on your Active Directory domain controllers.
The system works by passing the password entered on the Azure AD login page down to the on-premises connector. That connector then validates it against the on-premises domain controllers and returns the results. We’ve also made sure to integrate with self-service password reset (SSPR) so that, should the user need to change their password, it can be routed back to on-premises for a complete solution. There is absolutely no caching of the password in the cloud. Find more details about this process in our documentation.

Seamless single sign-on for all

Single sign-on is one of the most important aspects of the end-user experience our customers think through as they move to cloud services. You need more than just single sign-on for interactions between cloud services – you also need to ensure users won’t have to enter their passwords over and over again.
With the new single sign-on additions in Azure AD Connect you can enable seamless single sign-on for your corporate users (users on domain joined machines on the corporate network). In doing so, users are securely authenticated with Kerberos, just like they would be to other domain-joined resources, without needing to type passwords.
The beauty of this solution is that it doesn't require any additional infrastructure on-premises, since it simply uses your existing Active Directory services. It is also an opportunistic feature: if, for some reason, a user can't obtain a Kerberos ticket for single sign-on, they will simply be prompted for their password, just as they are today. It is available for both password hash sync and Azure AD pass-through authentication customers. Read more on seamless single sign-on in this documentation article.

Enabling these new capabilities

Download the latest version of Azure AD Connect now to get these new capabilities! You’ll find the new options in a custom install for new deployments, or, for existing deployments, when you change your sign-in method.


The fine print

As with all previews, there are some limits to what we currently support. We are working hard to ensure we provide full support across all systems. You can find the full list of supported clients and operating systems in the documentation, which we'll update regularly as things change.
Also, keep in mind that this is an authentication feature, so it’s best to try it out in a test environment to ensure you understand the end-user experience and how switching from one sign-on method to another will change that experience.
And last but by no means least, it’s your feedback that pushes us to make improvements like this to our products, so keep it coming. I look forward to hearing what you think!
Best regards,
Alex Simons



Guidance for running Elasticsearch on Azure

By Masashi Narumoto as written on azure.microsoft.com
Elasticsearch is a scalable open source search engine and database that has been gaining popularity among developers building cloud-based systems. When suitably configured, it is capable of ingesting and efficiently querying large volumes of data very rapidly.
It’s reasonably straightforward to build and deploy an Elasticsearch cluster to Azure. You can create a set of Windows or Linux VMs, then download the appropriate Elasticsearch packages to install it on each VM. Alternatively, we published an ARM template you can use with the Azure portal to automate much of the process.
Elasticsearch is highly configurable, but we’ve witnessed many systems where a poor selection of options has led to slow performance. One reason for this is that there are many factors you need to take into account in order to achieve the best throughput and most responsive system, including:

• The cluster topology (client nodes, master nodes, and data nodes)
• The structure of each index (the number of shards and replicas to specify; see the sketch after this list)
• The virtual hardware (disk capacity and speed, amount of memory, number of CPUs)
• The allocation of resources on each cluster (disk layout, Java Virtual Machine memory usage, Elasticsearch queues and threads, I/O buffers)
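
To make the index-structure point concrete, the sketch below creates an index with explicit shard and replica counts through Elasticsearch's REST API, using only the Python standard library. The cluster address and index name are placeholders, and the right counts depend entirely on your workload and virtual hardware.

    import json
    import urllib.request

    CLUSTER_URL = "http://localhost:9200"  # placeholder; point at a client node

    # Illustrative settings only: shards spread indexing across data nodes, while
    # replicas add redundancy and query capacity at the cost of disk and memory.
    settings = {
        "settings": {
            "number_of_shards": 6,
            "number_of_replicas": 1,
        }
    }

    request = urllib.request.Request(
        CLUSTER_URL + "/logs",  # hypothetical index name
        data=json.dumps(settings).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="PUT",
    )

    with urllib.request.urlopen(request) as response:
        print(json.loads(response.read()))  # expect {"acknowledged": true}

Note that the shard count cannot be changed after the index is created (the replica count can), which is one reason to settle these choices through testing before going to production.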

You cannot consider these items in isolation, because the nature of the workloads you run also has a great bearing on the performance of the system. An installation optimized for data ingestion might not be well tuned for queries, and vice versa, so you need to balance the requirements of the different operations your system must support. For these reasons, we spent considerable time working through a series of configurations, performing numerous tests, and analyzing the results.
The purpose was to illustrate how you can design and build an Elasticsearch cluster to meet your own requirements, and to show how you can test and tune performance. This guidance is now available in Azure documentation. We provided a series of documents covering:
• General guidance on Elasticsearch, describing the configuration options available and how you can apply them to a cluster running on Azure
• Specific guidance on deploying, configuring, and testing an Elasticsearch cluster that must support a high level of data ingestion operations
• Guidance and considerations for Elasticsearch systems that must support mixed workloads and/or query-intensive systems
We used Apache JMeter to conduct the performance tests, incorporating JUnit tests written in Java. We captured the performance data as a set of CSV files and used Excel to graph and analyze the results. We also used Elasticsearch Marvel to monitor the systems while the tests were running.
If you'd like to repeat these tasks on your own setup, the documentation provides instructions on how to create your own JMeter test environment and gather performance information from Elasticsearch, in addition to providing scripts to run our JMeter tests.
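If you would rather script the analysis step than use Excel, here is a minimal sketch that summarizes a JMeter results file with pandas. It assumes JMeter was configured to write CSV output with its default column names (label, elapsed, success); adjust these to match your own listener configuration.

    import pandas as pd

    # Assumes JMeter's default CSV columns; adjust to your listener configuration.
    results = pd.read_csv("jmeter-results.csv")

    # Normalize the success flag in case it is read as a "true"/"false" string.
    results["success"] = results["success"].astype(str).str.lower() == "true"

    # Latency percentiles per test label (elapsed time is in milliseconds).
    print(results.groupby("label")["elapsed"].describe(percentiles=[0.5, 0.95, 0.99]))

    # Error rate per test label.
    print(1.0 - results.groupby("label")["success"].mean())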


Azure Site Recovery & Backup

As statistics go, it's telling: ninety percent of executives recently surveyed agreed that they needed a business continuity and disaster recovery (BCDR) plan. Is your organization one of the many still without one? If so, we can help.

Drive Business Results Through Microsoft Azure Site Recovery (ASR) and Backup

Simple, automated protection: With Azure Site Recovery, you can protect Hyper-V, VMware, and even physical servers, with orchestrated recovery of services in the event of a site outage at the primary data center. Create multiple recovery plans to fail over only certain applications when a particular failure occurs in your data center. Test recovery with confidence: the test failover feature gives you confidence in the recovery solution and helps you meet the SLAs for your business, and you can perform planned failovers with zero data loss when you know about a disaster situation in advance.


Did you know...

According to research by the University of Texas, only 6 percent of companies suffering a catastrophic data loss survive, while 43 percent never reopen and 51 percent close within two years. ASSESS, ENABLE, and CAPTURE with your business's Azure Site Recovery plan. Call 800-208-3617 to get started!



New Windows 10 upgrade benefits for Windows Cloud Subscriptions in CSP

By Nic Fillingham as written on blogs.windows.com

We’re excited to announce that customers with Windows subscriptions via the Cloud Solution Provider (CSP) program can now upgrade their Windows 7 and Windows 8.1 PCs and devices to Windows 10 at no additional cost.
This means customers subscribed to Windows 10 Enterprise E3 and E5, as well as Secure Productive Enterprise E3 and E5, can now upgrade their Windows 7 and Windows 8.1 PCs and devices to Windows 10 without the need to purchase separate upgrade licenses.
This is an important benefit addition to Windows cloud subscriptions in CSP as it enables customers who have yet to purchase a new Windows 10 device, or who missed out on the free upgrade to Windows 10 campaign, to take advantage of enterprise-grade security, managed by a trusted partner, for the price of coffee and a donut.
To take advantage of this new upgrade benefit, tenant admins for customers with Windows cloud subscriptions can log in to the Office 365 Admin center (http://portal.office.com) with their Azure Active Directory admin credentials and see options to begin the upgrade on the device they are currently using, share the download link with others in their organization, create installation media, or troubleshoot installation.
The Windows 10 upgrade licenses issued as part of this process are perpetual and associated with the device. This means the license will not expire or be revoked if the customer chooses to end their Windows cloud subscription in the CSP program.


The new upgrade benefits are rolling out now and tenant admins with Windows subscriptions in CSP should start to see Windows 10 upgrade options and links in their Office 365 Admin center over the next 48 hours.
We hope these new Windows 10 upgrade benefits will better enable businesses of any size – including those with PCs and devices still on Windows 7 and Windows 8.1 – to work with a trusted partner to upgrade to enterprise-grade security and management with flexible, small business pricing from just $7 per user, per month.

Security features in Azure Backup help secure hybrid backups

By Pallavi Joshi as written on azure.microsoft.com

More and more customers are being hit by security breaches, which result in data loss, and the cost of a breach keeps rising. Despite having security measures in place, organizations face cyber threats because of vulnerabilities exposed across multiple IT systems. All of this raises pressing questions: Are your organization's IT applications and data safe? What would it cost to recover from the business impact of a cyberattack? And if you have a backup strategy in place, are your cloud backups secure?

"Currently, there are over 120 separate ransomware families, and we've seen a 3,500 percent increase in cybercriminal internet infrastructure for launching attacks since the beginning of the year," notes a recent CRN Quarterly Ransomware Report. To mitigate the threat of such attacks, the FBI recommends that users regularly back up their data and secure those backups in the cloud. This post covers the security features in Azure Backup that help secure hybrid backups.

Value proposition

The malware attacks that happen today target production servers to either re-encrypt data or remove it permanently. And when production data is affected, network shares and backups can be affected as well, leading to data loss or corruption. Hence, there is a strong need to protect both production and backup data against sophisticated attacks, and to have a strong security strategy in place to ensure data recoverability.

Azure Backup now provides security capabilities to protect cloud backups. These security features ensure that customers can secure their backups and recover data from cloud backups even if production and backup servers are compromised. The features are built on three principles: prevention, alerting, and recovery. Together, they increase organizations' preparedness against attacks and equip them with a robust backup solution.

Features

  1. Prevention: A new authentication layer has been added for critical operations such as deleting backup data or changing the passphrase. These operations now require a security PIN, available only to users with valid Azure credentials.
  2. Alerting: Email notifications are sent for any critical operation that impacts the availability of backup data, enabling users to detect attacks as soon as they occur.
  3. Recovery: Azure Backup retains deleted backup data for 14 days, ensuring recovery from any recent or older recovery point. A minimum number of recovery points is also always maintained, so there are always enough points to recover from.

Getting started with security features

To start using these features, navigate to the Recovery Services vault in the Azure portal and enable them. The video in the original post explains how to get started by enabling these features and how to use them in Azure Backup.


Four Green Tech Predictions for 2017

Written by Rob Bernard as seen on blogs.microsoft.com
The end of the calendar year is a traditional time to reflect on the ups and downs of the past year and to think about what to expect in 2017. As Microsoft's chief environmental strategist, I am encouraged by the progress made on some environmental issues in the past year, but there is still much work to be done. Despite the increasing urgency around many environmental issues, I remain optimistic about the future.
Perhaps the most notable breakthrough this past year was that the Paris Agreement on climate change entered into force. Cities, countries, and companies around the world are now focusing their efforts on how to set and execute their plans to reduce carbon emissions. We also saw growth in corporate procurement of renewable energy in 2016, both in the U.S. and around the globe, another encouraging sign. At Microsoft, we put forth our own goal to source 50 percent of our datacenter electricity from wind, solar, and hydropower. At the end of this year, we're happy to report that we are on pace not only to meet that goal but also to create new financial and technology models that can further accelerate the adoption of renewable energy.
As we look toward 2017, I expect we will see both continued progress on energy and an increasing focus on non-energy areas, with shifts in approaches and investments happening across the world.

1. IoT and cloud computing will begin to transform utility energy management

Aging infrastructure is already under pressure, and the addition of more renewable energy will only compound the stress. As more clean energy comes online, along with distributed resources like electric vehicles and rooftop solar, utilities face a big challenge: how to manage a more complex network of energy-creating and energy-storing devices. 2017 will see increased investment by utilities in technology that leverages data, through IoT solutions and cloud computing, to make energy management more predictable, flexible, and efficient.
In developing nations, we are seeing a different trend, but one that is also accelerated by IoT and cloud computing. In these markets, data is being used to accelerate distribution, sales and management of micro-solar grids to enable households to get both power and connectivity to the internet. 2017 should be an exciting year with even more growth as capital investments in these markets increase and solar and battery storage prices decline.

2. Water will begin to emerge as the next critical world-scale environmental challenge

Water scarcity is increasing in many areas around the world. Our oceans are under threat from pollution, acidification, and warming temperatures, and we are already seeing the devastating effects on iconic landmarks like the Great Barrier Reef. These trends are putting people's food, water, and livelihoods at risk. In 2017, awareness of this challenge will increase. We will begin to better understand what is happening to our water through the use of sensors and cloud computing, and our growing ability to leverage advanced mapping technologies and sensors will expand our understanding of what is driving the decline of many of our critical water systems.

3. Data will increasingly be used to try to better understand our planet

Data is essential for monitoring and managing the earth's resources and fragile ecosystems. There is much we do not understand about the planet, but we see an increasing number of companies and investments developing tools and platforms that enable better mapping and understanding of the earth's carbon storage, airborne gases, and ecosystems and the value they provide. We expect to see data applied more proactively to create a more actionable understanding of how we can better manage food, water, biodiversity, and climate change.

4. Organizations and policy makers will start leveraging cloud-based technologies

This area is perhaps the most difficult to predict. While countries will begin implementing their action plans under the Paris Agreement, it is not easy to foresee which methods each country will use and prioritize to make progress against its commitments. And the changes will happen not just at the national level: increasingly, we will see cities and local governments moving ahead with technology implementations to drive efficiency and accountability, along with policy changes as well. We're already leveraging machine learning and artificial intelligence to better model and understand the potential impact of legislative changes, in addition to offering our technologies to public sector partners as they work toward their plans. While this will likely take several years to take hold, 2017 should see increased awareness of the role of technology in environmental policy.
While there are many challenges ahead across so many areas of sustainability, I remain optimistic. The number of people and organizations focusing on these and many other areas of sustainability ensures that we will continue to make progress in 2017. At Microsoft, we are committed to working with our customers and partners to help them achieve more through the use of our technologies. Everyone, from companies to countries to individuals, has much work to do. We believe that by working together, we can help develop effective solutions to address environmental challenges that will benefit our business, our customers, and the planet. And we are up to the challenge.


Contact us Today!

Chat with an expert about your business’s technology needs.