Eight New Service Offerings in the Azure Government Cloud

Eight new service offerings added to Azure Government certification scope

Written by Derek Strausbaugh as seen on blogs.msdn.microsoft.com
We are pleased to announce the addition of Azure Resource Manager, Automation, Azure Batch, Log Analytics, Azure Media Services, Policy Administration Service/RBAC, Redis Cache, and Scheduler to certification scope in Microsoft Azure Government.
Each of these service offerings has received Joint Authorization Board (JAB) approval for addition to Azure Government's provisional authority to operate (P-ATO) at the High impact level. With the addition of these eight offerings, the total number of Azure Government services that meet the FedRAMP High baseline grows to 26, 20 more than AWS GovCloud.
These services may be used by Federal, DoD, and state and local government customers and partners building solutions on Azure Government who are required to meet rigorous compliance standards such as FedRAMP High, DISA L4, CJIS, ITAR, and IRS 1075. The Azure Blueprint program is designed to facilitate the secure and compliant use of these and other Azure Government service offerings by providing solution accelerators and guidance concerning customer security responsibilities when architecting solutions in Azure.

About these services

Azure Resource Manager – Azure Resource Manager (ARM) enables you to repeatedly deploy your app and have confidence your resources are deployed in a consistent state. You define the infrastructure and dependencies for your app in a single declarative template. This template is flexible enough to use for all of your environments such as test, staging or production.
You put resources with a common lifecycle into a resource group that can be deployed or deleted in a single action. You can see which resources are linked by a dependency. You can apply tags to resources to categorize them for management tasks such as billing, and you can control who in your organization can perform actions on the resources by defining roles for users and groups. ARM logs all user actions so you can audit them.
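For illustration only (this sketch is not part of the original announcement), creating a tagged resource group with the azure-identity and azure-mgmt-resource Python packages might look like the following; the subscription ID, group name, and tag values are placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

# Authenticate and point the client at a subscription (placeholder ID).
credential = DefaultAzureCredential()
client = ResourceManagementClient(credential, "<subscription-id>")

# Resources that share a lifecycle go into one resource group, which can be
# created, deployed into, or deleted as a single unit; tags support billing
# and other management tasks. (An Azure Government subscription also needs
# the Government cloud endpoints configured; that is omitted here.)
client.resource_groups.create_or_update(
    "demo-rg",
    {
        "location": "usgovvirginia",
        "tags": {"environment": "test", "costCenter": "1234"},
    },
)
```

Because the template and the group definition are declarative, repeating the same call produces the same resource group rather than a duplicate, which is what makes repeatable, consistent deployments possible.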
Automation – Azure Automation uses Windows PowerShell scripts and workflows – known as runbooks – to handle the creation, deployment, monitoring, and maintenance of Azure resources and third-party applications.  Automation runbooks work with Web Apps in Azure App Service, Azure Virtual Machines (Windows or Linux), Azure Storage, Azure SQL Database, and any service that offers public Internet APIs.
Azure Batch – Azure Batch makes it easy to run large-scale parallel and high-performance computing (HPC) workloads in Azure. Use Batch to scale out parallel workloads, manage the execution of tasks in a queue, and cloud-enable applications to offload compute jobs to the cloud.
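As a hypothetical sketch (not from the original post), submitting work with the azure-batch Python SDK looks roughly like this; the account name, key, URL, and pool name are placeholders, and the pool is assumed to already exist.

```python
from azure.batch import BatchServiceClient
from azure.batch.batch_auth import SharedKeyCredentials
import azure.batch.models as batchmodels

# Authenticate against the Batch account (placeholder name, key, and URL).
credentials = SharedKeyCredentials("<account-name>", "<account-key>")
client = BatchServiceClient(
    credentials, "https://<account-name>.<region>.batch.azure.com"
)

# Create a job bound to an existing pool, then queue a few parallel tasks;
# Batch schedules each task onto the pool's compute nodes as they free up.
client.job.add(batchmodels.JobAddParameter(
    id="demo-job",
    pool_info=batchmodels.PoolInformation(pool_id="demo-pool"),
))
for i in range(4):
    client.task.add("demo-job", batchmodels.TaskAddParameter(
        id="task-{}".format(i),
        command_line="/bin/bash -c 'echo processing chunk {}'".format(i),
    ))
```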
Log Analytics – Log Analytics is a service in Operations Management Suite that helps you collect and analyze data generated by resources in your cloud and on-premises environments. It gives you real-time insights using integrated search and custom dashboards to readily analyze millions of records across all of your workloads and servers regardless of their physical location.
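For a sense of what querying collected data looks like, here is a hypothetical sketch using the azure-monitor-query Python package (which postdates this post and is not mentioned in it); the workspace ID is a placeholder.

```python
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

# Run a Kusto query against a workspace: count heartbeat records per
# computer over the last 24 hours.
response = client.query_workspace(
    "<workspace-id>",
    "Heartbeat | summarize count() by Computer",
    timespan=timedelta(days=1),
)
for table in response.tables:
    for row in table.rows:
        print(row)
```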
Azure Media Services – Azure Media Services offers broadcast-quality video streaming services to reach larger audiences on today’s most popular mobile devices. With features that enhance accessibility, distribution, and scalability, Media Services makes it easy and cost-effective to stream and protect your content to audiences both local and worldwide.
Policy Administration Service/RBAC – Azure Role-Based Access Control (RBAC) enables fine-grained access management for Azure. Using RBAC, you can grant only the amount of access that users need to perform their jobs.
Redis Cache – Azure Redis Cache is based on the popular open-source Redis cache and gives you access to a secure, dedicated cache for your Azure application usage. It leverages the low-latency, high-throughput capabilities of the Redis engine. This separate, distributed cache layer allows your data tier to scale independently, for more efficient use of compute resources in your application layer.
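For illustration only (not from the original post), a cache-aside sketch with the open-source redis-py client shows the pattern; the cache host name and access key are placeholders, and the DNS suffix differs in Azure Government.

```python
import redis

# Connect to an Azure Redis Cache instance over SSL on port 6380; host name
# and access key come from the Azure portal (placeholders here).
cache = redis.StrictRedis(
    host="<cache-name>.redis.cache.windows.net",
    port=6380,
    password="<access-key>",
    ssl=True,
)

# Classic cache-aside: check the cache first and fall back to the data tier
# on a miss, keeping hot reads off the database.
def get_profile(user_id, load_from_db):
    key = "profile:{}".format(user_id)
    cached = cache.get(key)
    if cached is not None:
        return cached
    value = load_from_db(user_id)
    cache.set(key, value, ex=300)  # expire after five minutes
    return value
```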
Scheduler – Azure Scheduler lets you invoke actions that call HTTP/S endpoints or post messages to a storage queue on any schedule. You can use Scheduler to create jobs that reliably call services either inside or outside of Azure and run those jobs on demand, on a regular or irregular schedule, or at a future date.
Azure is dedicated to expanding the number of offerings available to government customers and will continue to provide updates through our blog and to add covered offerings to the Microsoft Trust Center.


[Image: dividing cancer cell, SEM]

Using data science to beat cancer

By Nancy Brinker as written on techcrunch.com
The complexity of seeking a cure for cancer has vexed researchers for decades. While they've made remarkable progress, they are still fighting an uphill battle, as cancer remains one of the leading causes of death worldwide.
Yet scientists may soon have a critical new ally at their side: intelligent machines that can attack that complexity in a different way.
Consider an example from the world of gaming: Last year, Google's artificial intelligence platform, AlphaGo, deployed deep-learning techniques to beat South Korean Grand Master Lee Sedol in the immensely complex game of Go, which has more possible moves than there are stars in the universe.
Those same techniques of machine learning and AI can be brought to bear in the massive scientific puzzle of cancer.
One thing is certain — we won’t have a shot at conquering cancer with these new methods if we don’t have more data to work with. Many data sets, including medical records, genetic tests and mammograms, for example, are locked up and out of reach of our best scientific minds and our best learning algorithms.
The good news is that big data’s role in cancer research is now at center stage, and a number of large-scale, government-led sequencing initiatives are moving forward. Those include the U.S. Department of Veterans Affairs’ Million Veteran Program; the 100,000 Genomes Project in the U.K.; and the NIH’s The Cancer Genome Atlas, which holds data from more than 11,000 patients and is open to researchers everywhere to analyze via the cloud. According to a recent study, as many as 2 billion human genomes could be sequenced by 2025.
There are other trends driving demand for fresh data, including genetic testing. In 2007, sequencing one person’s genome cost $10 million. Today you can get this done for less than $1,000. In other words, for every person sequenced 10 years ago, we can now do 10,000. The implications are big: Discovering that you have a mutation linked to higher risk of certain types of cancer can sometimes be a life-saving bit of information. And as costs approach mass affordability, research efforts approach massive potential scale.
A central challenge for researchers (and society) is that current data sets lack both volume and ethnic diversity. In addition, researchers often face restrictive legal terms and reluctant sharing partnerships. Even when organizations share genomic data sets, the agreements are typically between individual institutions for individual data sets. While there are larger clearinghouses and databases operating today that have done great work, we need more work on standardized terms and platforms to accelerate access.
The potential benefits of these new technologies go beyond identifying risk and screening. Advances in machine learning can help accelerate cancer drug development and therapy selection, enable doctors to match patients with clinical trials, and improve their ability to provide custom treatment plans for cancer patients (Herceptin, one of the earliest examples, remains one of the best).
We believe three things need to happen to make data more available for use in cancer research and AI programs. First, patients should be able to contribute data easily. This includes medical records, radiology images and genetic testing. Laboratory companies and medical centers should adopt a common consent form to make it easy and legal for data sharing to occur. Second, more funding is needed for researchers working at the intersection of AI, data science and cancer. Just as the Chan Zuckerberg Initiative is funding new tool development for medicine, new AI techniques need to be funded for medical applications. Third, new data sets should be generated, focused on people of all ethnicities. We need to make sure that advances in cancer research are accessible to all.
