[vc_row][vc_column][vc_column_text]

Learn how to detect potential vulnerabilities affecting your organization's identities. See how you can investigate suspicious incidents and configure automated responses. Understand how you can get a consolidated view of administrators, enable on-demand administrator access to Microsoft Online Services, and get reports and alerts about access to privileged roles.

[/vc_column_text][/vc_column][vc_column][vc_column_text][vc_button2 title="Watch Webinar On-Demand" color="carrot" size="lg" link="url:http%3A%2F%2Fevent.on24.com%2FeventRegistration%2Fconsole%2FEventConsoleApollo.jsp%3F%26eventid%3D1465491%26sessionid%3D1%26username%3D%26partnerref%3D%26format%3Dfhaudio%26mobile%3Dfalse%26flashsupportedmobiledevice%3Dfalse%26helpcenter%3Dfalse%26key%3D3A70BA2DFE6CCF0AD4685E38BE880CD8%26text_language_id%3Den%26playerwidth%3D1000%26playerheight%3D650%26overwritelobby%3Dy%26eventuserid%3D177275435%26contenttype%3DA%26mediametricsessionid%3D142331226%26mediametricid%3D2105423%26usercd%3D177275435%26mode%3Dlaunch||"][/vc_column_text][/vc_column][/vc_row][vc_row parallax="content-moving" css=".vc_custom_1465945819577{background-color: #e98922 !important;}"][vc_column width="1/2"][vc_column_text]
[/vc_column_text][/vc_column][vc_column width="1/2"][vc_column_text css_animation="appear"]

Achieve IT infrastructure cost savings of at least 50%

Call Southern California’s most trusted name in cloud at 800-208-3617 for real-time pricing and a cost-benefit analysis for Microsoft Azure.

[/vc_column_text][/vc_column][/vc_row]

[vc_row][vc_column][vc_column_text]As written on Microsoft Technet

In this blog post I will address some of the questions we’re receiving from customers regarding Azure Backup and Recovery Services vaults. We will focus on one of the most common Azure Backup / Recovery Services Vault management scenarios, which is a ‘vault swap’ for a Data Protection Manager (DPM) server or Azure Backup Server (MABS)*.

* Since DPM and Azure Backup Server have the same functionality with regard to a ‘vault swap’, we’ll simply use the term DPM to refer to both products in the rest of this article.

Azure Vault Basics

Let’s start with some baseline information regarding Azure Backup and Recovery Services vaults. First is the relationship of DPM server to vault. Options include:

One to One relationship (DPM1 : Vault1)

  • In this case, each DPM server is registered to its own vault. This works fine, but it can result in many different vaults to manage if you have multiple DPM servers, and it could cause you to reach the limit of 25 vaults per subscription mentioned in the Azure Backup FAQ.

Many to One relationship (DPM1, DPM2, DPM3 : Vault1)

  • For this option, multiple DPM servers are registered to the same vault. This is the most common scenario and is limited only by the fairly substantial limit of 50 servers per vault, which is also mentioned in the FAQ.

One to Many relationship (DPM1 : Vault1, Vault2)

For this option, a single DPM server may be registered to multiple vaults. At one time or another, for various business reasons, a DPM server may have written to multiple vaults and it’s likely those old Recovery Points need to be retained. As a result, the DPM server preserves its registration to multiple vaults. Note that only one vault at a time can be actively registered to a DPM server for reading or writing data. In other words, a DPM server cannot have one active vault for backups and another active vault for restores. There is a supported method to actively write to one vault and restore from another but it requires a second DPM server:

NOTE: If a DPM server (DPMServer1) has an active registration to one vault for backups (NewVault) but the backup admin needs to restore data from a former vault (OldVault), then use the Add external vault process from DPMServer2. Otherwise, re-registering OldVault as the active registered vault on DPMServer1 could allow DPM to write new backups into OldVault, which likely is not desirable – why else would the backup admin have changed from OldVault to NewVault?

If you do switch back and forth between vaults – perhaps one vault is activated for writing new data while another is temporarily activated for an emergency restore – then it may be difficult to track which vault is active. You can find the active vault name at the following location in the DPM server’s registry: HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows Azure Backup\Config\ServiceResourceName.
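If you would rather script that check, here is a minimal PowerShell sketch; it assumes the registry path above and that the Azure Backup agent is installed on the DPM server.

# Read the name of the currently active Recovery Services vault from the
# Azure Backup agent's Config registry key (path taken from above).
$key = 'HKLM:\SOFTWARE\Microsoft\Windows Azure Backup\Config'
$activeVault = (Get-ItemProperty -Path $key -Name ServiceResourceName).ServiceResourceName
Write-Output "Active vault: $activeVault"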

Azure Vault Swap

There are a couple of reasons why customers may change their DPM server from one Azure vault to another:

New Azure Data Center offerings make better business sense for some customers
In February 2017, the Azure Recovery Service was added to several new Azure Data Centers, including the Canada and UK paired Data Centers, as well as US West 2. US West Central was added a couple weeks prior in January 2017.
With these new service locations available, we’re seeing some customers change the location of their Recovery Services vaults (Classic vaults do not apply to these locations). The change could be due to reduced network latency if one of these new data centers is closer to the customer’s on-premises location. Or the change could be due to geo-political requirements, such as a Canada or UK customer that formerly saved backup data out-of-country simply due to there being no in-country offerings.

Various other business reasons
This really could be anything. For whatever reason, let’s say a customer wants to place 2016 data in a vault called ‘2016’ and 2017 data in a vault called ‘2017’. Or maybe they started writing data to a vault that was configured for GRS replication and later decided to change to a vault configured for LRS. I’m sure there are other possibilities here as well.

Pre-requisite Details

For full transparency, let’s first address Azure billing details, because there are scenarios where this could increase your costs for Azure Backup. A ‘vault swap’ can be performed while keeping the data in the old vault so you have access to all your old Recovery Points. Or you may decide you don’t need the historical data and simply delete the old vault. For those cases where the old vault and data are retained, you would be paying the vault costs (instance and storage costs) to maintain active Recovery Points in two vaults, since the data would be saved in two separate vault locations (and we’re not talking about GRS here). Saving two copies of your backup data may not be an issue if the data has a short retention, for example 2-4 weeks; increasing your Azure Backup costs for 2-4 weeks might not be cause for alarm. Along the same lines, a small amount of data would also likely make this cost increase acceptable.

However, customers with longer (such as yearly) retention or a large amount of data may decide that the cost is not worth going through this process. Or they may decide to delete the old data – assuming they have no SOX requirements, of course.

Process to change a DPM Server’s Vault

Changing a DPM server’s registered vault is a fairly easy process; it is the same as registering a new vault, which is documented here. Since the topic of this article is switching vaults, step #3 (installing the Azure Backup Agent) is most likely already done. If you’re switching to an older vault, then step #1 (create the vault) is obviously already completed as well. That simply leaves steps #2 and #4; a PowerShell sketch of step #4 follows the list below.

    1. Create a Recovery Services vault — Create a vault in Azure portal.
    2. Download vault credentials — Download the credentials which you use to register the DPM server to Recovery Services vault.
    3. Install the Azure Backup Agent — From Azure Backup, install the agent on each DPM server.
    4. Register the server — Register the DPM server to Recovery Services vault.
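
If you prefer to script the registration in step #4, the following is a minimal sketch. It assumes the DPM Management Shell is available on the DPM server and that the vault credentials file from step #2 has already been downloaded; Start-DPMCloudRegistration is the DPM cmdlet for registering a server to a vault, but verify the parameters against your DPM version. The server name and file path shown are placeholders.

# Register this DPM server to the new Recovery Services vault using the
# vault credentials file downloaded in step #2 (names and paths are placeholders).
Start-DPMCloudRegistration -DPMServerName "DPMServer1" -VaultCredentialsFilePath "C:\Temp\NewVault_Credentials.VaultCredentials"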

After changing a DPM server’s active registration to a new vault, the DPM UI won’t update automatically, so you’ll need to close and re-open the console. Next, DPM’s internal ‘online policy’ doesn’t get refreshed automatically, so if no further action is taken you may see online jobs fail with this error:

Type: Online recovery point
Status: Failed
Description: An internal error prevented the modification of the backup policy. (ID 100010)

This error can be resolved by updating the online policy for each Protection Group. To update the online policy, choose to Modify the Protection Group. Click through the wizard steps to get to the “Specify Online Retention Policy” screen. Increment one of the retention values, then click through the remaining wizard screens, and finally update the Protection Group. Making this change to the online properties will refresh the local policy and resolve the error listed previously. However, to keep your previous retention values, you’ll need to go through the same steps to Modify the Protection Group and change the value back to the original.

For example, if a Protection Group’s daily retention value is 3 days, I would first change it to 4 days and then continue through the wizard and update the Protection Group. Next, I would go through the wizard a second time and change the daily retention value back to 3 days. These steps can be automated using PowerShell if you have multiple Protection Groups to update; a sketch follows.
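
Here is a rough sketch of that automation using the DPM Management Shell. The cmdlets shown (Get-DPMProtectionGroup, Get-DPMModifiableProtectionGroup, Set-DPMPolicyObjective, Set-DPMProtectionGroup) come from the DPM cmdlet set, but the RetentionRange type name and the 30-day value are assumptions to verify against your DPM version; set the retention to whatever you actually intend to keep, since any update to the online policy refreshes it.

# Sketch: refresh the online policy for every Protection Group after a vault swap
# by re-applying an online retention range (30 days below is only an example value).
$pgList = Get-DPMProtectionGroup -DPMServerName "DPMServer1"
foreach ($pg in $pgList) {
    $mpg = Get-DPMModifiableProtectionGroup $pg
    $rrList = @()
    $rrList += New-Object -TypeName Microsoft.Internal.EnterpriseStorage.Dls.UI.ObjectModel.OMCommon.RetentionRange -ArgumentList 30, Days
    Set-DPMPolicyObjective -ProtectionGroup $mpg -OnlineRetentionRangeList $rrList
    Set-DPMProtectionGroup -ProtectionGroup $mpg
}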

Now that the DPM server’s vault has been changed and the online policy refreshed, your new backups will seamlessly be written to the newly registered vault.

Details for the Azure Backup Agent

The Azure Backup agent (MARS), which supports file and folder backups directly to Azure, has its own steps for vault registration, but I did find one item of caution while attempting a vault swap. The online policy appears to contain the list of selected folders, so after registering to the new vault, my list of selected folders was gone. Re-registering to the old vault brought them back, but if you plan to change to a new vault while using the MARS agent, make a note or take a screenshot of your selected folders so you can re-configure the backups just as they were before the vault change.
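
For reference, the MARS agent’s vault registration can also be scripted. The following is a minimal sketch assuming the MSOnlineBackup (MARS agent) PowerShell module is installed and the new vault’s credentials file has already been downloaded; the file path is a placeholder.

# Register the local MARS agent to a different Recovery Services vault
# using that vault's downloaded credentials file (placeholder path).
Import-Module MSOnlineBackup
Start-OBRegistration -VaultCredentials "C:\Temp\NewVault_Credentials.VaultCredentials" -Confirm:$false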

Scott Gehrke, Microsoft Support[/vc_column_text][/vc_column][/vc_row][vc_row][vc_column][vc_cta_button2 h2="Find Your Best Path To A Truly Consistent Hybrid Cloud" title="COST BENEFIT ANALYSIS" size="lg" position="bottom" link="url:http%3A%2F%2Fwww.managedsolution.com%2Faws-azure-compare%2F||" accent_color="#f4c61f"]

Achieve IT infrastructure cost savings of at least 50%

Call Southern California’s most trusted name in cloud at 800-208-3617 for real-time pricing and a cost-benefit analysis for Microsoft Azure and Amazon Web Services (AWS).

[/vc_cta_button2][/vc_column][/vc_row]

[vc_row][vc_column width="1/2"][vc_column_text]

Cloud Wars: Microsoft vs. Amazon Web Services

Why Microsoft Azure is the #1 Cloud Vendor

While Amazon Web Services has typically been a strong contender for cloud solutions, Microsoft is walking away with the title of number one cloud vendor (Source: Bob Evans on forbes.com). With its scalability, innovation, and complete cloud capabilities, Microsoft is a clear winner in the battle between AWS and Microsoft Azure.
Download the infographic below to see how we break down the ways Microsoft Azure shines against AWS in all things cloud.


[/vc_column_text][/vc_column][vc_column width="1/2"][vc_single_image image="17495" img_size="large" alignment="center"][/vc_column] [/vc_row]

[vc_row]
[vc_cta_button2 h2="" title="Cloud Comparison Calculator" size="lg" position="bottom" accent_color="#dd9933" link="url:http%3A%2F%2Fwww.managedsolution.com%2Fcloudtco%2F||"]

Managed Solution is in the top 1% of Microsoft Cloud Service Providers worldwide, and a premier partner aligned with Microsoft’s mission to empower every person and every organization on the planet to achieve more.

Download our Cloud Comparison Calculator to receive access to the latest in cloud pricing aggregation and your all-up cost of an on-premises vs. a cloud-hosted solution.

[/vc_cta_button2] [/vc_row]

[vc_row][vc_column][vc_column_text]

As written on enterprise.microsoft.com

[/vc_column_text][vc_column_text]

It’s 7:00 a.m. on the first day of a new academic session, and the servers at Keiser University are running at full power as students, faculty, and staff ramp up for a new month of learning. Twelve hours from now, the peak will drop for the evening, and some of those servers will shut down, saving the university thousands of dollars. A few days from now as students settle into their new routines, activity will drop during the daytime, too, so the system can run efficiently on even fewer servers.
Fortunately for the IT department, the system reboots, cranks through the data, and keeps everyone running at full speed all on its own, leaving IT staff with more time to be creative. In fact, today, Associate Vice Chancellor of IT Andrew Lee and his team are focusing on a paperless financial aid system. The ability to test new applications without the upfront capital needed for a traditional on-premises environment allows Keiser to stay on the leading edge of technology while saving the university precious time and funds.

[/vc_column_text][vc_column_text]

As Associate Vice Chancellor of IT at Keiser University, Andrew Lee handles everything that has to do with digital technology, ensuring that teachers and students have the tools and tech they need every day. When he joined Keiser 18 years ago, the IT department consisted of just two people serving five schools and 1,500 students.
Today, the university has 31 locations and close to 20,000 students. Andrew’s goal is to keep the school as “state of the art” as possible as it grows. He constantly looks for new technology that will help him do that while staying within budget.

Migrating an entire datacenter to Azure

Eighteen years ago, the university’s data was stored in a physical datacenter where the school owned the hardware. After transitioning to a “sort of” cloud, as Andrew describes it, where the hardware was leased and some of the infrastructure was paid for, he began looking at a full cloud solution.
“Everything was on the chopping block, and moving to the cloud just made sense,” Andrew says.
Microsoft Azure offered scalability and the ability to change on the fly. That sparked a fire in Andrew and his IT team, and now they’re moving the entire datacenter into Azure. With the new pay-as-you-go model, they don’t need up-front capital, and they have exactly as much as they need at any given time.
“It used to be that if we needed more storage, we had to lay out more capital expense. In Azure, we just log in and those resources are up and running within hours.” It’s a welcome change from the days of a physical datacenter with AT&T hosting. “Back then, we paid $35 – $40k every month. In Azure, I’ll have resources and servers that outshine anything we had there, and pay $5-10k less a month.”

[/vc_column_text][vc_column_text]

Without the restrictions inherent in a traditional system, Andrew and his team can be much more creative without breaking the budget. New ideas and projects can be tested and deployed without the red tape, and that means faculty can dream up new ways to educate their students, and those dreams can turn into reality much quicker.
“We’ve effectively gone from an old jalopy to an Italian sports car,” Andrew says.
With 2 million personal records, ensuring security is critical. And when it comes to compliance, Azure offers the ability to back up as much data as necessary for as long as it’s needed.
“In Azure, compliance is a no-brainer, and when you need more storage, you simply add it,” Andrew says.
Beyond Azure, Keiser University has migrated to Office 365 and is beginning to use more of the tools at their disposal. Staff and students who prefer to use their own devices on campus can quickly and securely connect to the school’s systems with Microsoft Enterprise Mobility Suite (EMS), and servers are secured, updated and patched with Microsoft Intune. Phishing attacks are the most common cyberthreat on campus, but by leveraging Active Directory, the IT department can help make sure accounts stay secure.

[/vc_column_text][vc_column_text]

Empowering educators and administrators to focus on what matters

For now, faculty, staff, and students are in the learning curve phase. Everyone has access to Office 365, and many are using OneNote and OneDrive to share documents and collaborate. Andrew’s IT department is doing the heavy lifting, migrating servers, data, and email to the cloud. Next up will be training and demo days to show faculty and staff all the tools that are readily available.
For educational institutions looking into a cloud solution, Andrew advises that it boils down to where you want to put your resources.
“In a cloud scenario, I’m out of the hardware business. After 18 years in this position, the thing that’s always caused the problems is hardware,” Andrew says. “Controller cards, fans, CPUs. Those are the things that come crashing down. With Azure, they’re a non-issue. All that redundancy is already there.”

[/vc_column_text][vc_column_text]

The best part? “No phone calls at 8 a.m. on a Sunday.” Andrew and his IT team sleep better at night knowing that the hardware pieces that tend to bring the system down go away. And that means they’re able to focus on being more of a partner in delivering technology so that faculty, staff, and students can get exactly what they need to do their work quickly and efficiently.

What’s next?

Andrew’s vision for the university is one where faculty and staff can register their own devices and have quick access to the resources and software they need, safely and securely, without even involving the IT department or needing its support. That will free up the IT team to keep looking for ways to use technology to simplify resources, deliver new solutions, and lower costs.

[/vc_column_text][vc_column_text]

For Andrew, the light bulb moment was realizing that, with Azure, his team could stand up a demo environment, throw hardware at it, work through the pitfalls, and configure it, all in one day, saving time and money in the deployment stage. Andrew fully optimizes Keiser University’s cloud investment by riding the wave of resource needs instead of keeping everything at 100% and waiting for the ebbs and flows.
Microsoft is proud to partner with Andrew and Keiser University to keep up with the pace of change in their digital transformation journey.

[/vc_column_text][/vc_column][/vc_row]

[vc_row][vc_column][vc_column_text]

By Beckylin Orooji as written on azure.microsoft.com
With the Application Insights JavaScript SDK you can collect and investigate the performance and usage of your web page or app. Historically, we have offered onboarding by manually adding a script to your application and redeploying. Manually adding the script is still supported, but recently we added the ability to enable client-side monitoring from the Azure portal in a few clicks as well.

Enablement

If you have enabled Application Insights in Azure, you can add page view and user telemetry. You can learn how to switch on server-side monitoring in our documentation.
     1. Select Settings -> Application Settings


     2. Under App Settings, add a new key value pair:

Key: APPINSIGHTS_JAVASCRIPT_ENABLED

Value: true


     3. Save the settings and Restart your app in the Overview tab.
The Application Insights JavaScript SDK is now injected into each web page.
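
If you manage the App Service from scripts, the same setting can be applied with the Az PowerShell module. This is a minimal sketch with placeholder resource names; note that Set-AzWebApp replaces the entire app settings collection, so the existing settings are copied first.

# Add APPINSIGHTS_JAVASCRIPT_ENABLED = true to an App Service and restart it
# (resource group and app names are placeholders).
$rg  = "MyResourceGroup"
$app = "MyWebApp"
$webApp = Get-AzWebApp -ResourceGroupName $rg -Name $app
$settings = @{}
foreach ($pair in $webApp.SiteConfig.AppSettings) { $settings[$pair.Name] = $pair.Value }
$settings["APPINSIGHTS_JAVASCRIPT_ENABLED"] = "true"
Set-AzWebApp -ResourceGroupName $rg -Name $app -AppSettings $settings
Restart-AzWebApp -ResourceGroupName $rg -Name $app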

[/vc_column_text][/vc_column][/vc_row][vc_row][vc_column][vc_column_text css=".vc_custom_1534360917373{background-color: #dd9933 !important;}"]

Want more info on cybersecurity? Contact us to learn more about keeping your data protected.

800-208-3617 

[/vc_column_text][/vc_column][/vc_row]

[vc_row][vc_column][vc_single_image image="15631" img_size="full" alignment="center"][vc_column_text]

Quorum reimagines the possibilities of oil and gas with Microsoft

As written on customers.microsoft.com
Quorum powers the oil and gas industry with cutting-edge software solutions built on a Microsoft-centric framework. Their software platform is used across every step of the energy cycle from well to burner. Quorum is leveraging their partnership with Microsoft to drive a new paradigm in the industry. With the full suite of Microsoft products including Azure, Surface Hubs, and Skype for Business, the company is able to stay at the forefront of innovation and deliver a seamless experience for all users.
Quorum has a history of innovation. For 20 years the company has been automating workflows and business processes for the oil and gas, renewable energy, and natural resources industries. Their software platform, built on a Microsoft-centric framework, has enabled them to successfully complete 1,500 deployments and projects for hundreds of customers.
Today, that software platform—designed to deliver both optimal efficiencies and maximized profits—boasts tens of thousands of users. Their solutions are used by all of the major energy companies across every step of the process, from well to burner.
“We’re about five to seven years ahead in terms of innovation and cloud enablement,” says Olivier Thierry, Quorum’s Chief Marketing Officer. With 17 of the top 20 E&P companies and 85% market share in midstream, the company is successfully transitioning current customers to its mobile-first myQuorum platform, migrating them to the cloud with cloud-enabled premium service offerings.

A Long-term Partnership

Quorum and Microsoft have a long history of working together. With the full suite of Microsoft products, Quorum stays at the forefront of product innovation to stay on top of their own digital transformation. Delivering insights through data, replacing a huge paper trail for greater efficiency, and providing a consumer-like experience appeals to a new generation of professionals and enables the company to deliver more innovation to its customers.
Watch the video and learn how Quorum uses the Microsoft technology stack to drive new user experiences.

The Hub of innovation and productivity

Now that they have enabled their customers to become more productive and mobile, Quorum wanted to help their own employees realize the same benefits. The ability to harness the power of technology to bring together geographically dispersed teams, share and collaborate on projects and documents, and stay up to speed on technology updates led them to Microsoft’s Surface Hub. Because it’s so intuitive, user adoption is high and has had a profound impact on the team. Another plus? Quorum realizes significant savings with the Surface Hub versus traditional videoconferencing and content-sharing solutions.
See how Quorum users interact and leverage Surface Hubs to deliver efficiency and collaboration.

New Opportunities Through Cutting-edge Technology

There’s little question that the oil and gas industry is changing. Long time employees are retiring, the cultural mindset and reliance on fossil fuels has evolved, and the economics of hydrocarbons are shifting. Taking advantage of the entire Microsoft technology stack—such as Microsoft Azure, SQL Server, Windows 10, Office 365, Surface devices, and Cortana Intelligence—their software is helping oil and gas companies navigate these changes more efficiently and effectively. Being ahead of the curve has Quorum prepared for when the IoT wave hits oil and gas.
With the help of Microsoft technologies, Quorum customers are reimagining the possibilities in the oil and gas industry and discovering previously unconsidered efficiencies. “There is so much we can do together to drive digital transformation to the oil and gas sector,” Thierry says of Quorum’s partnership with Microsoft. “And we are starting to lead that digital transformation.”

[/vc_column_text][/vc_column][/vc_row]

For more information, call us at 800-208-3617

[vc_row][vc_column][vc_column_text]

Introducing Modern Backup Storage with Azure Backup Server on Windows Server 2016

By Maanas Saran as written on azure.microsoft.com
One of the key features announced with the latest release of Azure Backup Server is Modern Backup Storage. Modern Backup Storage is a technology that leverages Windows Server 2016 native capabilities such as ReFS block cloning, deduplication, and workload-aware storage to optimize backup storage and time, delivering nearly 50% disk storage savings and 3x faster backups. With Modern Backup Storage, Azure Backup Server goes a step further in enhancing enterprise backups by completely restructuring the way data is backed up and stored.

How does Modern Backup Storage work?

    1. Add volumes to Modern Backup Storage and configure Workload Aware Storage.

    2. Begin backing up by creating a Protection Group that uses Modern Backup Storage.

With these simple steps, you can efficiently store your backups using Modern Backup Storage technology.

To learn more about professional services, contact us at 800-208-3617


[/vc_column_text][/vc_column][/vc_row]

[vc_row][vc_column][vc_column_text]

Azure SQL Data Sync Refresh

By Joshua Gnanayutham as written on azure.microsoft.com
We are happy to announce our Azure SQL Data Sync Refresh! With Azure SQL Data Sync, users can easily synchronize data bi-directionally between multiple Azure SQL databases and/or on-premises SQL Server databases. This release includes several major improvements to the service, including new Azure portal support, PowerShell and REST API support, and enhancements to security and privacy.
This update will be available for selected existing Data Sync customers starting June 1st. It will be available for all customers by June 15th. Please email us with your subscription ID if you’d like early access.

What’s new?

Data Sync on the new Azure portal

Data Sync is now available in the new Azure portal for select internal customers. This will be available for all customers in mid-June. You can now manage Data Sync in the same place you manage all your other Azure resources. Data Sync will be retired from the old portal after July 1, 2017.

PowerShell programmability and REST APIs (Available July 2017)

Previously in Data Sync, creating Sync groups and making changes had to be done manually through the UI. This could be a tedious, time-consuming process, especially in complex Sync topologies with many member databases or Sync groups. Starting in July, Data Sync will support PowerShell and REST APIs, which developers can leverage to make these tasks faster and easier. This is also great for the many users who are comfortable with and prefer using PowerShell.
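
As a rough illustration of what scripted management could look like, here is a hedged sketch using the sync cmdlets that ship in the Az.Sql PowerShell module (Start-AzSqlSyncGroupSync, Get-AzSqlSyncGroup); the exact cmdlet surface available at the release described here may differ, and all resource names below are placeholders.

# Trigger an on-demand sync for an existing sync group, then list the sync
# groups on the hub database to inspect their state (names are placeholders).
$params = @{
    ResourceGroupName = "MyResourceGroup"
    ServerName        = "hub-server"
    DatabaseName      = "HubDb"
}
Start-AzSqlSyncGroupSync @params -SyncGroupName "MySyncGroup"
Get-AzSqlSyncGroup @params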

Better security, better privacy, better resilience

In the previous design, Data Sync used a central shared database for each region to manage the Sync operations. Now each user will have dedicated, user-owned Sync Databases. A Sync Database is a customer-owned Azure SQL Database. By replacing the shared central databases with customer-specific databases, we provide better privacy and security. In addition, this gives the user the flexibility to increase or decrease the performance tier of the Sync Database based on their needs.

Sync Database Requirements

  • Azure SQL Database of any service tier
  • Same region as the Hub Database of a Sync Group(s)
  • Same subscription as Sync Group(s)
  • One per region in which you have a Sync Group (Hub Database)

Enhanced monitoring and troubleshooting

We have made a few key improvements to monitoring and troubleshooting. Users can now monitor the sync status programmatically using PowerShell and REST APIs. In addition, we’ve improved several error messages, making them clearer and more actionable.

Next steps

New users

If you would like to try Data Sync, refer to this tutorial.

Existing users

Existing users will be migrated to the new service starting June 1, 2017. For more information on migration, see the blog post “Migrating to Azure SQL Data Sync 2.0.”

Looking for a technology partner to assist with a specific project? Call Managed Solution at 800-208-3617 or contact us to schedule a full analysis of the performance of your network.


[/vc_column_text][/vc_column][/vc_row]

Contact us Today!

Chat with an expert about your business’s technology needs.