
Introduction to cloud computing and Microsoft Azure

Cloud computing overview

Cloud computing provides a modern alternative to the traditional on-premises datacenter. Public cloud vendors provide and manage all computing infrastructure and the underlying management software. These vendors provide a wide variety of cloud services. A cloud service in this case might be a virtual machine, a web server, or cloud-hosted database engine. As a cloud provider customer, you lease these cloud services on an as-needed basis. In doing so, you convert the capital expense of hardware maintenance into an operational expense. A cloud service also provides these benefits:
  •   Rapid deployment of large compute environments
  •   Rapid deallocation of systems that are no longer required
  •   Easy deployment of traditionally complex systems like load balancers
  •   Ability to provide flexible compute capacity or scale when needed
  •   More cost-effective computing environments
  •   Access from anywhere with a web-based portal or programmatic automation
  •   Cloud-based services to meet most compute and application needs
With on-premises infrastructure, you have complete control over the hardware and software that is deployed. Historically, this has led to hardware procurement decisions that focus on scaling up. An example is purchasing a server with more cores to satisfy peak performance needs. Unfortunately, this infrastructure might be underutilized outside a demand window. With Azure, you can deploy only the infrastructure that you need, and adjust this up or down at any time. This leads to a focus on scaling out through the deployment of additional compute nodes to satisfy a performance need. Although this has consequences for the design of an appropriate software architecture, there is now ample proof that scaling out with commodity cloud services is more cost-effective than scaling up through expensive hardware.
Microsoft has deployed many Azure datacenters around the globe, with more planned. Additionally, Microsoft is expanding its sovereign clouds in regions such as China and Germany. Only the largest global enterprises can deploy datacenters in this manner, so using Azure makes it easy for enterprises of any size to deploy their services close to their customers.
For small businesses, Azure allows for a low-cost entry point, with the ability to scale rapidly as demand for compute increases. This prevents a large up-front capital investment in infrastructure, and it provides the flexibility to architect and re-architect systems as needed. The use of cloud computing fits well with the scale-fast and fail-fast model of startup growth.

Types of cloud computing

Cloud computing is usually classified into three categories: SaaS, PaaS, and IaaS.

SaaS: Software as a service

SaaS is software that is centrally hosted and managed. It’s usually based on a multitenant architecture— a single version of the application is used for all customers. It can be scaled out to multiple instances to ensure the best performance in all locations. SaaS software typically is licensed through a monthly or annual subscription.
Microsoft Office 365 is a prototypical model of a SaaS offering. Subscribers pay a monthly or annual subscription fee, and they get Microsoft Exchange as a service (online and/or desktop Microsoft Outlook), storage as a service (Microsoft OneDrive), and the rest of the Microsoft Office suite (online, the desktop version, or both). Subscribers always get the most recent version. So you can have an Exchange server without having to purchase a server and install and support Exchange—the Exchange server is managed for you. Compared to installing and upgrading Office every year, this is much less expensive and requires much less effort to keep updated.

PaaS: Platform as a service

With PaaS, you deploy your application into an application-hosting environment that the cloud service vendor provides. The developer provides the application, and the PaaS vendor provides the ability to deploy and run it. This frees developers from infrastructure management so they can focus on development.
Azure provides several PaaS compute offerings, including the Web Apps feature of Azure App Service and Azure Cloud Services (web and worker roles). In either case, developers have multiple ways to deploy their application without knowing anything about the nuts and bolts that support it. Developers don’t have to create virtual machines (VMs), use Remote Desktop Protocol (RDP) to sign in to each one, or install the application. They just hit a button (or close to it), and the tools provided by Microsoft provision the VMs and then deploy and install the application on them.
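To make the "nuts and bolts" point concrete, here is a minimal sketch of one way a packaged web app can be pushed to the Web Apps feature of App Service without provisioning or signing in to any VM: posting a zip file to the app's Kudu zip-deploy endpoint. The app name, publishing credentials, and package path below are placeholders, and this is only one of several supported deployment paths (Visual Studio publishing, Git, and CI pipelines are others).

```python
# Minimal sketch: pushing a zipped web app to an App Service web app
# through the Kudu zip-deploy endpoint. The app name, publishing
# credentials, and package path are hypothetical placeholders.
import requests

APP_NAME = "contoso-web"               # hypothetical web app name
DEPLOY_USER = "$contoso-web"           # publishing profile username
DEPLOY_PASSWORD = "<publishing-password>"
PACKAGE = "site.zip"                   # zipped application content

url = f"https://{APP_NAME}.scm.azurewebsites.net/api/zipdeploy"

with open(PACKAGE, "rb") as package:
    response = requests.post(
        url,
        data=package,
        auth=(DEPLOY_USER, DEPLOY_PASSWORD),
        headers={"Content-Type": "application/zip"},
    )

response.raise_for_status()
print("Deployment accepted:", response.status_code)
```

App Service unpacks the archive and serves the site on platform-managed VMs; the developer never creates or signs in to those machines.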

IaaS: Infrastructure as a service

An IaaS cloud vendor runs and manages all physical compute resources and the required software to enable computer virtualization. A customer of this service deploys virtual machines in these hosted datacenters. Although the virtual machines are located in an offsite datacenter, the IaaS consumer has control over the configuration and management of them.
Azure includes several IaaS solutions, including Azure Virtual Machines, virtual machine scale sets, and related networking infrastructure. Azure Virtual Machines is a popular choice for initially migrating services to Azure because it enables a “lift and shift” migration model. You can configure a VM like the infrastructure currently running your services in your datacenter, and then migrate your software to the new VM. You might need to make configuration updates, such as URLs to other services or storage, but you can migrate many applications in this way.
Virtual machine scale sets are built on top of Azure Virtual Machines and provide an easy way to deploy clusters of identical VMs. Virtual machine scale sets also support autoscaling so that new VMs can be deployed automatically when required. This makes virtual machine scale sets an ideal platform to host higher-level microservice compute clusters, such as Azure Service Fabric and Azure Container Service.
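As a rough illustration of that scaling model, the sketch below raises a scale set's instance count with the Azure SDK for Python (the azure-identity and azure-mgmt-compute packages, which are newer than this article). The subscription ID, resource group, and scale set name are placeholders; in practice, autoscale rules usually adjust this capacity automatically in response to metrics rather than code like this.

```python
# Minimal sketch (not from the article): scaling out a virtual machine
# scale set by raising its instance capacity. The subscription ID,
# resource group, and scale set name are hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP = "demo-rg"
SCALE_SET = "demo-vmss"

client = ComputeManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Read the current scale set model, then bump the SKU capacity
# (the number of VM instances) to scale out.
vmss = client.virtual_machine_scale_sets.get(RESOURCE_GROUP, SCALE_SET)
vmss.sku.capacity = (vmss.sku.capacity or 0) + 2   # add two instances

poller = client.virtual_machine_scale_sets.begin_create_or_update(
    RESOURCE_GROUP, SCALE_SET, vmss
)
poller.result()  # block until the scale-out operation completes
print(f"{SCALE_SET} now has {vmss.sku.capacity} instances")
```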

Azure services

Azure offers many services in its cloud computing platform. These services include the following.
Compute services
Services for hosting and running application workloads:
  •   Azure Virtual Machines—both Linux and Windows
  •   App Services (Web Apps, Mobile Apps, Logic Apps, API Apps, and Function Apps)
  •   Azure Batch (for large-scale parallel and batch compute jobs)
  •   Azure RemoteApp
  •   Azure Service Fabric
  •   Azure Container Service
Data services
Services for storing and managing data:
  •   Azure Storage (comprises the Azure Blob, Queue, Table, and File services)
  •   Azure SQL Database
  •   Azure DocumentDB
  •   Microsoft Azure StorSimple
  •   Azure Redis Cache
Application services
Services for building and operating applications:
  •   Azure Active Directory (Azure AD)
  •   Azure Service Bus for connecting distributed systems
  •   Azure HDInsight for processing big data
  •   Azure Scheduler
  •   Azure Media Services
Network services
Services for networking both within Azure and between Azure and on-premises datacenters:
  •   Azure Virtual Network
  •   Azure ExpressRoute
  •   Azure-provided DNS
  •   Azure Traffic Manager
  •   Azure Content Delivery Network

New Windows 10 upgrade benefits for Windows Cloud Subscriptions in CSP

By Nic Fillingham as written on blogs.windows.com

We’re excited to announce that customers with Windows subscriptions via the Cloud Solution Provider (CSP) program can now upgrade their Windows 7 and Windows 8.1 PCs and devices to Windows 10 at no additional cost.
This means customers subscribed to Windows 10 Enterprise E3 and E5 as well as Secure Productive Enterprise E3 and E5, can now upgrade their Windows 7 and Windows 8.1 PCs and devices to Windows 10 without the need to purchase separate upgrade licenses.
This is an important benefit addition to Windows cloud subscriptions in CSP as it enables customers who have yet to purchase a new Windows 10 device, or who missed out on the free upgrade to Windows 10 campaign, to take advantage of enterprise-grade security, managed by a trusted partner, for the price of coffee and a donut.
In order to take advantage of this new upgrade benefit, tenant admins for customers with Windows cloud subscriptions can log in to the Office 365 Admin center (http://portal.office.com) with their Azure Active Directory admin credentials and see options to begin the upgrade on the device they are currently using, share the download link with others in their organization, create installation media, or troubleshoot installation.
The Windows 10 upgrade licenses issued as part of this process are perpetual and associated with the device. This means the license will not expire or be revoked if the customer chooses to end their Windows cloud subscription in the CSP program.


The new upgrade benefits are rolling out now and tenant admins with Windows subscriptions in CSP should start to see Windows 10 upgrade options and links in their Office 365 Admin center over the next 48 hours.
We hope these new Windows 10 upgrade benefits will better enable businesses of any size – including those with PCs and devices still on Windows 7 and Windows 8.1 – to work with a trusted partner to upgrade to enterprise-grade security and management with flexible, small business pricing from just $7 per user, per month.


Tech Sector Nonprofit Saves 79 Percent, Gains Global Market Access with Cloud Hosting

As written on customers.microsoft.com
Pro Bono Net provides web-based technology services that support law firms, courts, legal aid, and individuals throughout the United States. So when some of its own, on-premises servers reached end of life, what technology did the organization choose to replace them? Windows Azure. The nonprofit reduced annual cost after payback by 79 percent, made its service faster and more reliable, and has access to a global marketplace that was previously out of its reach.

Business Needs

Companies of all sizes are turning increasingly from on-premises IT infrastructures to cloud-based services for obvious reasons: they cost less and make it possible for companies to focus on their core strengths, rather than on commodity IT maintenance. But what do the providers of those services do when they face the same choice as their customers—and should those customers care?

A case in point is Pro Bono Net, a national nonprofit organization dedicated to increasing access to justice through innovative uses of technology and increased volunteer lawyer participation. The organization meets this mission, among other ways, through its Pro Bono Manager™ service, which boosts a law firm’s pro bono program management capacity. Operating as a secure, seamless extension of a law firm’s intranet, Pro Bono Manager integrates content from the public-interest legal community with reporting, knowledge management, and lawyer-and-case matching tools that draw on a firm’s own human resources and time keeping systems.

Pro Bono Manager is a web-based, or software-as-a-service, solution—and its low cost and the minimal management required of the law firms that adopt it have been among its selling points. But the cloud that hosted the service was a very physical set of servers owned and managed by Pro Bono Net. When those servers reached end of life, Pro Bono Net faced the same choice that its customers had answered by choosing Pro Bono Manager: Should Pro Bono Net refresh its hardware installation, or migrate Pro Bono Manager to a cloud platform?

The organization had to consider the economics of its choices, as any enterprise would. But, as a service provider to others, it had additional considerations: Would a move to the cloud affect the prices, availability, reliability, and speed that Pro Bono Net offered its customers and, if so, how?

Solution

Pro Bono Net already had experience with the cloud; some of its other solutions ran on Amazon Web Services. But when it came time to migrate Pro Bono Manager, the organization chose Windows Azure, the Microsoft cloud computing platform.

One reason: Windows Azure was built from the ground up to support the same Microsoft technologies—Microsoft SharePoint Server, Microsoft SQL Server Reporting Services (in the cloud: Windows Azure SQL Reporting), and the Microsoft .NET Framework—that Pro Bono Net already used. Another reason: Microsoft offered Windows Azure Virtual Machines, which provided the flexibility and availability that comes from the use of virtualization technology.

Pro Bono Net used Windows Azure Virtual Machines for persistent virtualization in support of SharePoint Server, which serves as the foundation for Pro Bono Manager. If the organization had been moving between more consistent platforms—say, two virtual platforms, one managed on-premises and one in the cloud—it would have been easier to estimate cost. Going from a physical/on-premises platform to a virtual/cloud platform required some experimentation in preproduction environments, which the organization and Microsoft completed successfully.

Pro Bono Net eventually decided on a high-availability infrastructure that replicated domain controllers, front ends, application servers, and Windows Azure SQL Database instances on virtual machines. It also adopted Windows Azure availability sets to further mitigate risk and promote reliability. And as its use of Windows Azure grows, the organization expects to adopt geo-colocation features that will further increase fault tolerance and business continuity.

Benefits

By using Windows Azure, Pro Bono Net gains lower cost, greater reliability, faster performance, and new business opportunities. The organization plans to move its Amazon-based sites to the Microsoft cloud platform, too.

Avoids 79 Percent Cost of On-Premises Solution

Cost was a key factor for Pro Bono Net in deciding between an on-premises and cloud-based platform for Pro Bono Manager. By choosing Windows Azure, the organization avoided a US$25,000 investment in production hardware and services, plus $8,300 in maintenance and system administration. It also avoids another $25,000 investment to replicate the environment for the sake of business continuity.

For its specific configuration on Windows Azure, Pro Bono Net spends $11,000 annually—and saves 79 percent over comparable cost for an on-premises infrastructure and support, after a 1.4-year payback period.

Uptime Rises to 3 “9s,” Users See 20 Percent Faster Loads

Pro Bono Net now pays less to support Pro Bono Manager while gaining more, particularly more reliability. Since the move to Windows Azure, uptime for the application has increased from 99 percent to 99.9 percent. “That’s a significant increase for us,” says Alec Rosin, Consulting Engineer for Pro Bono Net. “On-premises, if we had a disaster, we could be out for a week. We don’t anticipate that happening on Windows Azure.”
Pages and reports now load about 20 percent faster on Windows Azure, creating a more natural user experience.

Gives National Organization the Tools to Go Global

Pro Bono Net expected lower cost and better service from Windows Azure. What it didn’t expect was new business opportunities—but it now has them, too. Many countries or regions require that sensitive data, including legal data, remain within their borders. Pro Bono Net, with its US-based data center, couldn’t go after this business before.
Now, using Windows Azure’s global data centers and Content Delivery Network, it can. “We can go from being a national service organization to a global service organization, by using Windows Azure,” says Adam Licht, Director of Product Management at Pro Bono Net.


Public vs private? Hybrid gives the biggest gains

By James Staten as written on azure.microsoft.com
Want to hear about the best examples of enterprise computing today? At OpenStack Silicon Valley, August 9-10, you will hear about some of the most innovative implementations in the market, and what you will find is that the majority are not purely in the public cloud, nor are they solely in the data center. Enterprises today are getting the most benefit when they take advantage of both and use the hybrid model of cloud computing.
Why? Because using this “blended” model lets you stop worrying about where your apps are, and focus more on how to leverage the right resources for the right value that helps you deliver innovation and create new value streams for your business. According to IDC, eighty-two percent of you already have a hybrid cloud strategy -- but are you utilizing the mix of resources for maximum gains?
Despite what other cloud market leaders might tell you, hybrid is not a temporary state but the new normal, and the normal for decades to come. That’s because compute and data are everywhere and being generated everywhere. And if you want to deliver greater business value, you need to embrace and leverage this breadth. In my session at OpenStack Silicon Valley, on August 10 at 10:00 am PDT, I’ll take you through why leading organizations are thinking hybrid now and for the future. The catalyst for this thinking isn’t infrastructure ownership, control, or security. It’s what’s right for the apps.
Want global reach, massive elastic scale, and a wealth of innovative compute and data services? Use the public cloud: that’s what it was designed for!
Perhaps you have a factory automation solution, point-of-sale system, or a legacy application that you’d just as soon keep in your data center for now. If there’s a good reason not to move to the cloud, you shouldn’t – but think about connecting those apps to cloud-based, agile and highly scalable business workflows, mobile customer experiences, IoT initiatives and other innovations that help drive your business values forward.
For example, Microsoft and GE recently announced a partnership where GE’s vast base of industrial computing capabilities will connect to its Predix analytics service running in Azure. “Connecting industrial machines to the internet through the cloud is a huge step toward simplifying business processes and reimagining how work gets done,” said Jeff Immelt, CEO of GE. You can’t move the industrial machines to the cloud, obviously, but offloading all the analytics from the data center to the cloud makes great technical and economic sense.
Want to connect on-premises and cloud apps securely so that you have one consistently managed “virtual” data center?
With Azure virtual networking, or ExpressRoute, our dedicated-line solution, you can connect securely and get the bandwidth you need. With Microsoft’s Operations Management Suite, you can have a “single pane of glass” for managing workloads across OpenStack, public Azure, AWS, and other deployments.
Here’s what Michael Alatortsev, chief executive officer at iTrend, a provider of data discovery services, said about their hybrid application: “A Microsoft hybrid cloud solution with … Azure enabled us to make more flexible design decisions.” He added, “We can adapt our solution to address different market verticals and situations, and creating a new type of report is very easy.”
Today, using Infrastructure-as-a-Service, you can easily migrate VMs from your data center to the cloud. With SQL Server, you can keep your database local, shift to a blended deployment using SQL Server 2016, or trivially move it to Azure – as mobile application provider App Dynamic did: “The transition of [the on-premises] database to … Azure SQL Database only took a few hours,” said Pratik Kumar, CEO and founder of App Dynamic.
Microsoft offers the most comprehensive suite of offerings for the hybrid cloud, from networking and directory services to application services like SQL Server and big data, backup and recovery, and our Office and Dynamics offerings, all of which can run either on-premises or in the cloud, and can easily connect to other applications running in either location. No other vendor provides such a rich hybrid portfolio, and why? Because we’ve been providing enterprise-grade technologies to our customers for years and we understand your needs.
At the end of the day, it’s not about where your computing assets are, it’s how they provide value to your business! Think of the cloud as a new opportunity – not an obligation – for you to drive breakthrough levels of value from your computing assets, and design your hybrid cloud for the maximum business value.
Have questions about this premise? Want to hear more about how to implement this strategy at your organization? Join me at OpenStack Silicon Valley on August 10. Already executing on a successful hybrid app strategy? Tell us about it in the comments section below. Let’s get you on stage at next year’s show.



Microsoft’s Excel API, which lets developers access data stored in spreadsheets, hits general availability

By Frederic Lardinois as written on techcrunch.com
After a relatively short beta, Microsoft today announced that the Excel API — a way for developers to programmatically use Excel for Office 365 for doing calculations, building dashboards and more — is now generally available.
Microsoft first announced the API last November and then detailed its plans for turning Office 365 into more of a platform for developers at its Build conference in March. Like all of Microsoft’s Office 365 APIs, the new Excel one will be available through the Microsoft Graph, the company’s unified API endpoint for all of its cloud services.
Implicitly, the Excel API acknowledges that most businesses use (and abuse) Excel to store lots of data. Using the service, developers will be able to perform calculations based on this data — and data from their apps outside of Office 365. They can also call on data and calculations from Excel sheets for building reports and dashboards.
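As a hedged sketch of what calling the Excel API through Microsoft Graph can look like, the snippet below reads a range of cells from a workbook stored in OneDrive for Business. The access token, drive item ID, and worksheet name are placeholders; the request shape follows the Graph workbook endpoints rather than any code from the announcement.

```python
# Illustrative sketch: reading a cell range from an Excel workbook via
# the Microsoft Graph workbook API. The access token, drive item ID,
# and worksheet name are hypothetical placeholders.
import requests

ACCESS_TOKEN = "<OAuth2 access token with Files.Read scope>"
ITEM_ID = "<drive item ID of the .xlsx file>"
WORKSHEET = "Sheet1"

url = (
    "https://graph.microsoft.com/v1.0/me/drive/items/"
    f"{ITEM_ID}/workbook/worksheets/{WORKSHEET}/range(address='A1:C10')"
)

response = requests.get(
    url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"}
)
response.raise_for_status()

# The range resource includes the cell values for the requested block.
for row in response.json()["values"]:
    print(row)
```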
Microsoft has partnered with two third-party services to make accessing this API easier. Zapier users, for example, can now build integrations with Excel for Office 365, and Sage, which offers business management services for small and medium businesses, is integrating it with its accounting solution.


Microsoft Azure obtains ISO/IEC 27017:2015 certification

By Alice Rison as written on azure.microsoft.com
We are happy to announce that Microsoft Azure has obtained the ISO/IEC 27017:2015 certification, an international standard that aligns with and complements ISO/IEC 27002:2013 with an emphasis on cloud-specific threats and risks.
The standard provides implementation guidance on 37 of the controls in ISO/IEC 27002 and adds seven new controls not addressed in ISO/IEC 27002. Both cloud service providers and cloud service customers can leverage this guidance to effectively design and implement cloud computing information security controls. Customers can download the ISO/IEC 27017 certificate, which demonstrates Microsoft’s continuous commitment to providing a secure and compliant cloud environment for our customers.
Microsoft Azure helps customers meet their compliance requirements across a broad range of regulated industries and markets including financial services, healthcare, life sciences, media and entertainment, worldwide public sector, and US federal, state and local government.



The SwimTrain exergame makes swim workouts fun again

By Miran Lee as written on blogs.msdn.microsoft.com

To many who swim for exercise, workouts come down to the monotony of doing laps—swimming back and forth in a pool. Over and over. Unlike other exercisers, who can make their routines less of a chore by adding a social component—working out with friends, family, or in groups—swimmers really haven’t had many options, because coordinating a group of swimmers is difficult. The Korea Advanced Institute of Science and Technology (KAIST) and Microsoft Research Asia (MSRA) are happy to report that with SwimTrain, their new cooperative “exergame” research project, you’ll never have to swim alone again.
SwimTrain is the result of a research collaboration between KAIST and MSRA. The project targets something we can all relate to: exercise boredom. Swimming, while one of the best ways to get fit, can be tedious. The SwimTrain team thinks they have a way to make swimming a lot more exciting.

SwimTrain

How does SwimTrain work? First, you slip your phone into a waterproof case and plug in some waterproof headphones. Then, you jump in. Players get matched up as a team to form a virtual “train,” with each player controlling the speed of a single train compartment. Go too fast or too slow, and the game warns you of bumping into other compartments. Featuring narration, vibration feedback, spatialized sound effects, and background music, the immersive experience takes players through different modes of gameplay based on an interval training workout plan.
Each SwimTrain round consists of three phases:
Phase 1: Compartment ordering
Compartments race against other compartments. A compartment is ranked based on a swimmer’s average stroke speed during the race.
Phase 2: Train running
Compartments are placed along the same track and run in a circle (like a merry-go-round). To earn points, each swimmer must match their current stroke rate to the target stroke rate established in the previous phase. A compartment shifts forward or backward as the swimmer’s current stroke rate changes relative to the target, and it should travel without crashing into adjacent compartments (a rough sketch of this logic appears after the phase list).
Phase 3: Train stop
The virtual train stops. Every swimmer takes a short rest. The game narrates the final ranking of the current round and information for the next round, such as the duration of each phase and recommended stroke types.
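To make the Phase 2 mechanics easier to picture, here is a purely illustrative Python sketch of how a compartment's position might drift with the gap between the current and target stroke rates. It is not the SwimTrain team's code; the gain and spacing values are invented for illustration.

```python
# Purely illustrative sketch of the Phase 2 idea described above: a
# compartment drifts forward or backward as the swimmer's stroke rate
# deviates from the target rate, and a bump is flagged when it gets too
# close to a neighboring compartment. None of these numbers come from
# SwimTrain itself.

GAIN = 0.5          # how strongly a stroke-rate gap moves the compartment
MIN_SPACING = 1.0   # minimum allowed distance to adjacent compartments


def update_compartment(position, current_rate, target_rate,
                       ahead_position, behind_position):
    """Return the new position and whether the compartment bumped a neighbor."""
    # Faster than the target -> drift forward; slower -> drift backward.
    position += GAIN * (current_rate - target_rate)

    bumped = (ahead_position - position < MIN_SPACING or
              position - behind_position < MIN_SPACING)
    return position, bumped


# Example: a swimmer two strokes per minute above target closes in on
# the compartment ahead and triggers a bump warning.
pos, bumped = update_compartment(position=10.0, current_rate=32.0,
                                 target_rate=30.0,
                                 ahead_position=11.5, behind_position=7.0)
print(pos, bumped)   # 11.0 True
```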
SwimTrain accomplishes immersive gameplay by relying on advanced tech packed into a mobile phone. The barometer, accelerometer, gyroscope, and magnetometer track swimming activities, determining swimming periods, stroke, style, speed, and other events. This information is fed to a Network Manager based on the Microsoft Azure cloud, and is then delivered back to the game as rank and round data, determining the status of the player in relation to the train. It’s also passed to a Feedback Manager, which provides the auditory and sensory feedback that make SwimTrain unique.
Preliminary feedback from users is positive—SwimTrain makes you feel like you’re not alone in the pool. According to one test user, “Although [SwimTrain] didn’t provide any visual feedback, I felt like I was swimming with others.” Feedback also indicates that SwimTrain provides an immersive and enjoyable experience that’s an intense workout, too.

The project team’s research is getting noticed in the world of human-computer interaction (HCI). CHI 2016, the world’s top conference for HCI, has accepted the team’s research for inclusion in the CHI 2016 Notes and Papers Program.
This collaboration with KAIST is a great example of how Microsoft values symbiotic relationships with partners in academia. “Not only do we have the ability to shape the future of Microsoft products, we have the chance to support and learn from some of the top professors in computer science,” said Darren Edge, lead researcher at MSRA. Many of these collaborations lead to internships. “When a student makes a particularly promising contribution to a joint project, we can also invite them to spend time at Microsoft as a research intern. Everybody wins from such internships: we get some of the brightest PhD students to work on our projects, and the students develop new expertise and skills that they can apply to their university work with their professor.”
Darren explains that this recently happened as a result of his ongoing collaboration with Professor Uichin Lee at KAIST. Following the completion of work on SwimTrain, Professor Lee’s PhD student Jeungmin Oh joined Darren at MSRA for a six-month internship, working in another area. “We are all now collaborating on multiple projects in parallel. If any of them are as successful as SwimTrain, which won the third place award at the recent Microsoft Korea and Japan Day and has two accepted papers pending publication, I will be very happy indeed,” he states.
The MSRA HCI group has in fact had a longstanding collaboration with academia: In recent years, MSRA has supported principal investigators for projects published at CHI 2014, CSCW 2015, and CHI 2016.
In the future, SwimTrain will focus on measuring more data, such as heart rate and maximal oxygen uptake, to determine the exertion level of a player’s swimming. Also, the method might be applied to other group exercises, such as group jogging and group cycling. We look forward with anticipation to what SwimTrain might inspire.



Aiming to Deliver New Drugs Faster at Less Cost in the Cloud


Researchers from Molplex, a small drug discovery company; Newcastle University; and Microsoft Research Connections are working together to help scientists around the world deliver new medicines more quickly and at lower cost. This partnership has helped Molplex develop Clouds Against Disease, an offering of high-quality drug discovery services based on a new molecular discovery platform that draws its power from cloud computing with Windows Azure.
Rethinking Drug Discovery
David Leahy, co-founder and chief executive officer of Molplex, envisions a way to help pharmaceutical researchers anywhere in the world form effective drug discovery teams without large investments in technology or fixed running costs. "It takes massive computing resources to search through chemical and biological databases looking for new drug candidates. Our Clouds Against Disease solution dramatically reduces the time and cost of doing that by providing computation and chemical analysis services on demand," Leahy says.
Molplex regards drug discovery as a big data and search optimization problem. Clouds Against Disease uses its computational molecular discovery platform to automate decision making that is traditionally the scientists’ task.
"Instead of having teams of scientists scanning chemical information, our software searches for structures that have multiple properties matching the search criteria," explains Leahy. "When we integrate that with highly automated chemical synthesis and screening, it becomes a much more efficient and productive way of doing drug discovery."
Data Manipulation on a Larger Scale
In a recent pre-clinical study, the company applied its computational platform to more than 10,000 chemical structure and biological activity data sets. This action generated 750,000 predictive relationships between chemical structure and biological effect. After generating numerous possible outcomes, Molplex then used the same validation criteria that scientists would use to narrow down the 750,000 relationships to just 23,000 models covering 1,000 biological and physico-chemical properties, a relatively small data set that humans could then manage. "It would have taken hundreds of scientists several years to do this the conventional way," says Leahy.
Windows Azure was critical to the success of Clouds Against Disease. Molplex can access 100 or more Windows Azure nodes—in effect, virtual servers—to process data rapidly. The physical-world alternative would be to source, purchase, provision, and then manage 100 physical servers, which represents a significant investment in up-front costs. Before they could begin drug research, scientists taking this traditional approach would have to raise millions of dollars, but Windows Azure helps eliminate start-up costs by allowing new companies to pay for only what they use in computing resources.
Vladimir J. Sykora, co-founder and chief operating officer for Molplex, explains that the Molplex computational platform runs algorithms his company developed to calculate the numerical properties of molecules rapidly. Consequently, Molplex has been able to produce drug discovery results on a much larger scale than what was previously feasible. "We would not have been able to predict so many compounds without the cloud computing resources enabled by Windows Azure," asserts Sykora. "The speed and high level of detail provided by Windows Azure allow us to explore far beyond what would have been possible with traditional hardware resources."
Fighting Tropical Diseases
Molplex is embarking on a new collaboration with the Malaysian government to search for drugs that fight tropical diseases. This search has always been a lower priority for drug companies because the market is smaller, making it a less desirable commercial prospect. The traditional drug discovery program is geared to $1 billion a year blockbuster drugs; however, there are fewer opportunities today for drugs with that level of commercial potential.
Increasingly, scientists are researching tropical diseases that affect smaller populations; radically reducing the cost of drug discovery makes it feasible for scientists to tackle them. "Unlocking drug discovery technology from a physical location with the cloud has tremendous potential to help researchers work on curing these diseases faster and at less cost," asserts Leahy, "wherever they are in the world."


Contact us Today!

Chat with an expert about your business’s technology needs.