Meet the Tech Exec: Ken Lawonn, Senior Vice President and Chief Information Officer, Sharp HealthCare

[vc_row gmbt_prlx_parallax="up" font_color="#ffffff" css=".vc_custom_1501859784808{padding-top: 170px !important;padding-right: 0px !important;padding-bottom: 190px !important;padding-left: 0px !important;background: rgba(55,82,161,0.66) url(https://managedsolut.wpengine.com/wp-content/uploads/2017/08/CIO-Interview-header-Managed-Solution-1.jpg?id=) !important;background-position: center !important;background-repeat: no-repeat !important;background-size: cover !important;*background-color: rgb(55,82,161) !important;}"][vc_column][vc_column_text]

MEET THE TECH EXEC

Ken Lawonn

Senior Vice President and Chief Information Officer, Sharp HealthCare

[/vc_column_text][/vc_column][/vc_row][vc_row css=".vc_custom_1501859913491{background-color: #e0e0e0 !important;}"][vc_column width="1/2"][vc_column_text]

To download the full magazine and read the full interviews, click here.
Ken Lawonn is the Senior Vice President and Chief Information Officer for Sharp HealthCare. In his role he is responsible for continuing to move Sharp forward in the implementation of advanced technologies to support the transformation of healthcare through the Sharp Experience. He joined the Sharp team in February 2014.
Lawonn has over 35 years’ experience in healthcare technology leadership. Prior to joining Sharp, Lawonn served as the Senior Vice President for strategy and technology at Alegent Creighton Health in Omaha, NE. Under his leadership, Alegent was recognized as a leader in the deployment of technology to support integrated clinical care. Lawonn also served as the Vice President and Chief Information Officer for Banner Health and Lutheran Health Systems in Fargo, ND.
Lawonn received his bachelor’s degree in computer information systems from Moorhead State University in Moorhead, Minnesota and an MBA from the University of Nebraska. He is a member of the College of Healthcare Information Management Executives (CHIME) and a fellow in the American College of Healthcare Executives (FACHE). 

[/vc_column_text][/vc_column][vc_column width="1/2"][vc_single_image image="18528" img_size="large" alignment="center"][/vc_column][/vc_row][vc_column][vc_column_text]

What are the top 3 areas of focus for IT executives?
Security is one of the top 3 areas because a breach can cause both financial and reputational damage. Analytics is a big focus for us, especially in healthcare: we have spent so much time automating our data, and now we have to spend time figuring out how to leverage it. The third is incorporating a digital strategy and transforming operations. In healthcare, we have to move out of this operational model toward a digital model and leverage clinical data to support better decision making. Healthcare is data rich, knowledge poor. We have all this unstructured data and have to figure out how to bring it together.
What’s your take on Public Cloud?
We have been hesitant because of privacy issues; early on, the public providers couldn't support business associate agreements. Cloud is revolutionizing the way computing is provided, and it's changing the way we think about computing. Hybrid clouds have emerged as an alternative view to the public cloud. It's really large-scale computing served up on demand, and it's changing the way we and other providers think about computing services. I think it's the early phase of where we are headed.
We don't use Office 365 yet. We use SaaS, and most of it runs in private datacenters. We are looking at whether we really need to own this stuff and run a mixture of cloud services and on-premise services. We are moving out of our primary datacenter and looking at both Las Vegas and Phoenix because of cost and because of concerns about environmental issues like earthquakes and fires; those locations can protect us in those areas.
What superpower do you want most?
I don't have any interest in having a superpower. Reading people's minds might be kind of fun. I've always felt you're better off being seen as a partner and an equal. If you have a superpower you come across as superior, and it's truly hard to be effective that way.
When you were a kid what did you want to grow up to be?
After I got through the Hercules phase, I really wanted to be a major league baseball player, center field. I just couldn't hit a curve ball. I've always been a New York Yankees fan.
How is IT helping to drive revenue through the company?
We are looking at taking our current assets and exposing those services to more people, making them more readily available through things like telehealth and video-based online services. That allows us to extend services without having to build new buildings or have people come to us. Using technology makes things more convenient for individuals. You can schedule an online visit with a physician or a nurse practitioner and use it for follow-up visits; it doesn't always make sense for you to come back in. You can very easily do things online, at your convenience, even after hours. We also use technology to understand whether we are providing the best treatment, to make sure we are not penalized, and to make sure we are effectively leveraging our payment process and leveraging technology to help increase revenues.
We partner with Cerner primarily and do some work with Allscripts. We look at what technologies can run those platforms, then we pick a storage partner to work with. It's not unusual for a healthcare organization to have hundreds of applications that they are supporting.
We are hearing so much about the internet of things – what does, or could, the internet of things look like for your business?
We see it as huge, both in what it is able to provide us and in the elements needed to support it. Today we have invested heavily in integrating medical devices into our electronic data records, from pumps to monitors. We are going to make all the devices able to communicate in a standard format we can accept and analyze. We have chronic patients with diabetes or congestive heart failure, and we need to keep track of them at home to see if they are weighing themselves and so on. We have devices that can relay that information to us automatically, so we don't have to go out to their home. We see tremendous advantage in using those kinds of capabilities: we can monitor and track patients to provide better care at a lower cost. Having a connected world of all these devices helps healthcare leverage continuous monitoring and the movement toward consumer involvement in their health. Some data is meaningless, so we are learning how to collect, interpret and leverage that data. Adding more data that is unactionable or not meaningful is a challenge.
Are there hiring challenges based on the economy we're currently facing today? Or is it a challenge of finding the right skillsets and expertise?
In our business, it's often a combination of skillset and cultural fit. The provider side of healthcare doesn't always pay the best compared to biotech companies. We are looking for people who are attracted to serving and helping people; we hire more for fit. Our challenges are in a couple of areas, like management-level positions and highly sought-after skills such as security, data scientists and web developers. We mostly hire Southern California-based individuals, and they don't need a healthcare background for certain positions. We are very service oriented and deliver the Sharp Experience, which is our brand: a team approach. You can't bulldoze change. We are an organization that changes very slowly.
What kind of messaging is coming down from the CEO/Key Executives about their partnership with IT?  
We've gone from IT being a backend service to being partners with the business, which is still a critical approach. IT is the engine for business transformation and growth; there is hardly anything we look at that doesn't have technology involved in some way. There is so much technology available that you have to pick the right technology, hold people accountable to leverage it, and provide value. We have to think about how we bend the cost curve; we can't keep increasing the spend if there isn't some return. We have to work together, and they want us to make it simple and make it work. How do we transform to become a different kind of business, more of a digital, real-time business? You can't just keep adding on cost. Everyone likes to add stuff; nobody likes to take things away.
CIOs are becoming more like change agents and transformational leaders. The message is that we've got to be nimble, faster, more accessible. One of the biggest challenges is that everybody seems to be an expert in technology today. With our watches and our phones, people have a sense of what things should be able to do, and they don't always understand the complications in making it work and making it easy to use.
Has the idea of using cloud changed your mindset of using outsourced/Managed Services?
It's changed the outsourcing model. We used to think of it as turning things over to someone else; now we think of purchasing services and renting storage as a service. It's just different thinking. The message we keep repeating is: let's stop worrying about who owns it or where it is physically. Let's think about the best way to provide a service to our organization, and that's just different today than it used to be.
If you could give guidance to any IT manager or director about how to position their careers, what would you tell them?
If they want to be successful, they have to invest in knowing the business and the customers they are serving, and in forming a partnership with key business leaders to support, grow and sometimes transform that business. Those leaders are relying on you to help them understand what can be applied, what the requirements are and how to leverage them. Think about your resources differently: do you have to own it, or can you rent it?

[/vc_column_text][/vc_column][vc_row][vc_column][vc_cta_button2 h2="" title="LEARN MORE" color="belizehole" accent_color="#ed884e" link="url:http%3A%2F%2Fwww.managedsolution.com%2Fmeet-the-c-level-interview%2F|title:managedsolution.com|"]

MEET THE TECH EXEC INTERVIEWS

Managed Solution is conducting interviews as part of an outreach initiative to share trends and engage technology enthusiasts in the southwest.

[/vc_cta_button2][/vc_column][/vc_row]

Meet the Tech Exec: Joe Beery, Senior Vice President and Chief Information Officer, Thermo Fisher Scientific

[vc_row gmbt_prlx_parallax="up" font_color="#ffffff" css=".vc_custom_1501859784808{padding-top: 170px !important;padding-right: 0px !important;padding-bottom: 190px !important;padding-left: 0px !important;background: rgba(55,82,161,0.66) url(https://managedsolut.wpengine.com/wp-content/uploads/2017/08/CIO-Interview-header-Managed-Solution-1.jpg?id=) !important;background-position: center !important;background-repeat: no-repeat !important;background-size: cover !important;*background-color: rgb(55,82,161) !important;}"][vc_column][vc_empty_space][vc_column_text]

MEET THE TECH EXEC

JOE BEERY

Senior Vice President and Chief Information Officer

Thermo Fisher Scientific

[/vc_column_text][/vc_column][/vc_row][vc_row css=".vc_custom_1501859913491{background-color: #e0e0e0 !important;}"][vc_column][vc_column_text]

To download the full magazine and read the full interviews, click here.
Joe Beery was named Senior Vice President of Information Technology and Chief Information Officer for Thermo Fisher Scientific in 2014, following the company's acquisition of Life Technologies. During his tenure as leader of Thermo Fisher's global Information Technology infrastructure, Joe has been instrumental in cultivating a unified, "One Team" culture, emphasizing and improving operational reliability, and focusing on innovative strategies, such as the implementation of eCommerce and cloud solutions, to best serve the business and its customers.
Joe had served as the Senior Vice President of Information Technology and Chief Information Officer for Life Technologies since taking the position at Invitrogen in 2008. In this role, he directed the development and operation of the company's information technology systems and eCommerce website. He also spearheaded the implementation of IT programs to support the merger of Invitrogen and Applied Biosystems in 2008.
Prior to Invitrogen, Joe was Chief Information Officer at US Airways and America West Airlines for 10 years. Previously, he spent 10 years at Motorola Semiconductor, holding various positions in the computer integrated manufacturing group, and also served as a manufacturing and software engineer at NV Philips in Albuquerque, New Mexico.
Joe holds a bachelor's degree in business administration and business computer systems from the University of New Mexico. He is a member of the board of Rare Genomics, a nonprofit supporting the use of genomics in diagnosing rare disease, and previously served on the board of CypherGenomics, a software and services company in next-generation sequencing.

[/vc_column_text][vc_column_text]

What is your focus this year?

You have to look at it from a deeper IT perspective: we do a lot of software development and cloud work for all of our integral platforms. One of our biggest priorities is IT transformation, within the organization and with our customers. We are moving to become a digital science company.
We have a very large manufacturing portfolio; it's a very product-focused company. Our biggest priority, when we think of our customers and the future, is the integration of our platforms. They are searching for an answer, and the answer is a digital answer.
We have the largest life sciences cloud in the industry, with over 50K users, launched about a year and a half ago. The first priority is the integration of our business with our IT capabilities. Scientists store data in the cloud, our global platform, and attached to that is a science capability. We're connecting the entire workflow with consumables and analytics. The cloud is incredibly important to our customers. With customer and data analytics, we have the ability to see not only what a customer spends but what kinds of experiments they are doing.
The IT organization is in the middle of every one of the company's long-term steps; whether it's our sales process or anything else, it's all connected. We are merging technologies, with over 150K users, providing our customers with better services. The first priority is driving the company to a digital business. It's huge, and it's really something that we're moving through as quickly as possible.

Where does the Cloud fit in your organization?

We are one of the largest life sciences customers of AWS. We are moving away from doing cloud on premise; it's all done with major partners. Around our strategy, with over 2K IT people globally, we decided to invest. We are actively moving our eCommerce capabilities and internal systems to the cloud, to AWS. For a company of our size, we surprisingly have a small data center footprint. Our large portfolio is not completely in the cloud, but we have a Cloud First strategy: I don't sign a purchase agreement for any hardware equipment on premise. We classify our three-part strategy as Born in the Cloud, Transitioning to the Cloud, and Cloud as an Infrastructure. Can a provider be our long-term future? We believe so. We are going through HIPAA and all the security steps. To really get the value out of the cloud you have to use somebody else who does it really well; Amazon can hire more people into their security group than I can hire into my IT organization in a single year.

How do you view security from an identity management perspective? 

We continue to look at that over and over, and we are thoughtful about our customers. Thermo Fisher Scientific is now a conglomeration of thousands of acquisitions. Everything is driven through thermofisher.com, and we have one identity management strategy through that platform that is secure. Internally we are using an Oracle platform. It's a journey and there has to be patience; you do the best job working with your security partners. I just hired a new CSO, and I'm running as fast as I can to pull those platforms together. It's a constant journey. When people are logging in to use systems, there are different passwords; yeah, it's just where we are.

As a CIO, as you look at the transition of your background, what have you had to tweak in your thought process? Things like becoming more fiscally responsible since you have more OpEx spend? Have you felt you have to take on a different leadership style because of cloud?

Yeah, I've only really been in three industries. For a CIO of my tenure, it's interesting; I haven't jumped around a lot: semiconductors, airlines and now life sciences. With the emerging and advanced technologies we have available to us today, the biggest challenge is retooling the leadership to think in one particular way. It's two things, but the first is you have to move absolutely as fast as possible. You have to move your investment profile from infrastructure to value-added activities, from legacy datacenter, heavy-metal capability, to the things that are going to move the needle for the business. None of our budgets gets that much bigger; they grow through inheriting companies or a big initiative. The biggest challenge is hiring leadership that understands that all the things we used to do 5-10 years ago have to substantially change in order to reinvest. In the future people will hire drastically differently and make the transformation in thinking, because maybe you could ride that on-premise horse for a little longer, but if you don't get off, you're really going to lose the battle. Everybody is looking to take advantage of that different profile. That is the mantra executives are getting: same amount of dollars, more capability. Now the challenge is OpEx vs. CapEx. We feel like we're ahead of the game; we projected this.

What is your CEO asking from you?

My CEO is looking to move to a cloud strategy faster. We feel good about the position we are in; the CEO never thought we would say this, but we're the competitive advantage for the company. The biggest thing on their mind is that it's an incredibly target-rich environment, because we can do things better than we've done before. You really can think differently, and that's what we're trying to get our heads around. In the next five years it's going to be drastically different: you won't have silos, and you will really drive your business differently. The question is what we go after first that's going to drive the greatest amount of value. The speed we can move is so much faster, but you don't want to make mistakes; you want to invest in the right things. We can go a lot faster than we've gone in the past. Your investment dollars aren't going to go up much, a little but not much, so it's about whether they're being put in the right place. We focus on three things: we operate as one team; we win with the basics, because cloud has to have reliability and scalability; and the third is what you are going to do to move the needle, that is, what will drive the greatest amount of value using emerging and advanced technologies.

You're gathering all this big data that sits in the cloud. Where are you getting that data from? Are you able to govern it and look at data analytics? Talk to me about IoT.

IoT is very interesting; in some cases, many industries have been doing it for a while, and it's just more enabled now. The way we think of it is how we think of the lab of the future, changing the way our customers view our products. That could be anything from sophisticated instruments to anything we see in their workflow. How do we automate that workflow? Use the internet of things to drive that a lot faster. Mobility is really driving us internally more than anything: how do we automate our customers' workflow? With BI and analytics it's really a target-rich environment for us, both internally and externally. We are connecting all of our salesforce around lead management; we are in silos. We did a lot of work crunching our CRM data and systems to better enable our sales agents to manage leads. When you look at the customers we have and at the complexity of our portfolio, that's a great opportunity for us. As we continue to roll out the cloud, it gets us closer to the customer.
That is the beauty of where Thermo Fisher is: we have the biggest portfolio. If you look at the actual scientific analysis, we are using genomic data to help doctors and clinicians do a better job of serving their clients. We are at the very beginning of learning how to learn from big data, connect it together and ultimately drive better decisions. The product in airlines was a seat: you measured every seat every mile. In semiconductors it was a semiconductor: you sold power controllers. In our world, we sell these products and have warehouses full of products, but at the end of the day it's all a digital answer: what type of cancer do I have and what is the best treatment?
That's why I love the company and the industry; this is the last place I will ever work. When you look at what we do, the mission is to make the world healthier, cleaner, safer, and that is all digital. So this is going to change the way you live and I live. I have two kids alive today because of what the company does. They were both born with a rare genetic disorder, and because of our instruments they found the diagnosis, the rare genetic mutation, and treated it; they take medication three times a day, and it was this digital answer. When I think of the future of big data, you see a future that is really different for everyone.

Talk to me about back up and disaster recovery.

For on-premise systems we have SunGard and the normal capabilities. When we think of AWS and Microsoft, the investment profile they're putting into this capability far exceeds anything anyone else can do. What are we doing differently, how are we investing? It's all of those dollars our execs want us to trim so we can invest in what will move the needle. We have a great relationship with Microsoft, Oracle, and AWS, and over time we will continue to mix it up.

Any challenges with hiring?

You have to groom them from within. It's really exciting internally here: we had over 200 people go through DevOps training on the Amazon platform. We have very few data centers, so from a training perspective it's a pretty exciting opportunity. You have to train them yourself; it will pay off, because there just aren't enough folks out there right now. We have a pretty aggressive training platform.

[/vc_column_text][/vc_column][/vc_row][vc_column][vc_column_text]

[/vc_column_text][/vc_column][vc_row][vc_column heading_color="primary-1"][vc_empty_space][grve_callout title="Tech Spotlight Interviews" button_text="Learn more" button_link="url:http%3A%2F%2Finfo.managedsolution.com%2Fc-level-interview-registration||target:%20_blank|"]IT is a journey, not a destination. We want to hear about YOUR journey!
Are you a technology innovator or enthusiast?
We would love to highlight you in the next edition of our Tech Spotlight.[/grve_callout][/vc_column][/vc_row]

Meet the Tech Exec: Todd Stewart, Vice President of Global Infrastructure and IT Operations, Western Digital

[vc_row css=".vc_custom_1501859913491{background-color: #e0e0e0 !important;}"][vc_column width="1/2"][vc_column_text]

To download the full magazine and read the full interviews, click here.

Todd is Vice President of Global Infrastructure and IT Operations for Western Digital, the world’s largest data storage company.  In this role, he is responsible for on-premise and cloud computing, global network, communications, data storage and data centers.  Additional responsibilities include support for high-performance, engineering, and manufacturing computing, as well as desktop productivity and end-user service and support.  Prior to Western Digital, he was responsible for IT Operations and Infrastructure for AMN Healthcare, Amylin Pharmaceuticals, and Siebel Systems.  Todd holds an MBA from the University of Georgia. [/vc_column_text][/vc_column][vc_column width="1/2"][grve_single_image image="17940"][/vc_column][/vc_row][vc_row][vc_column][vc_column_text]What’s the #1 area of focus CIO’s should concentrate on?

CIOs are really the technology CEO of their company. The primary goal of a company is to attract and retain customers. Therefore, the CIO should be highly focused on the business strategy for attracting and retaining customers. Many executive peers remain technologically naïve – a CIO has to understand every part of the business, every function, and then be able to advise those functions on meeting those goals through technology. The most successful CIO will truly understand every business function.

What’s your take on Public Cloud?

WDC is a thought leader with public cloud. We are a very large customer of AWS, and we also hold the world's record for the largest computing job ever run in AWS (we've broken our own record several times). We are increasing our usage of Azure – Microsoft is really making progress making that service more relevant. Actually, Microsoft is a completely different company from three years ago, in a good way. Public cloud is one of several tools for corporate computing, but not one that solves every need – or will ever solve every need. This is a hybrid world.

What areas come to top of mind today when looking at Public Cloud?

Public cloud isn't cheap. Some things run cheaper on premise. Sometimes you have capital to spend, and public cloud is only OpEx. Fit the right tool to the right job; this is a hybrid world.
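As a rough illustration of why public cloud isn't automatically cheaper, here is a minimal break-even sketch comparing an amortized on-premise server against an always-on cloud instance. All prices, lifetimes and rates below are hypothetical placeholders, not WDC's figures.

```python
# Hypothetical break-even comparison: amortized on-premise server vs. an
# always-on cloud instance. Every number here is an illustrative placeholder.

def on_prem_monthly_cost(purchase_price, lifetime_months, monthly_opex):
    # Spread the capital cost over the server's useful life, then add
    # ongoing power/space/admin costs.
    return purchase_price / lifetime_months + monthly_opex

def cloud_monthly_cost(hourly_rate, hours_per_month=730):
    # Cost of keeping one instance running around the clock.
    return hourly_rate * hours_per_month

if __name__ == "__main__":
    on_prem = on_prem_monthly_cost(purchase_price=12_000, lifetime_months=48, monthly_opex=150)
    cloud = cloud_monthly_cost(hourly_rate=0.60)
    print(f"on-premise: ~${on_prem:,.0f}/month")
    print(f"cloud:      ~${cloud:,.0f}/month")
    # A steady 24/7 workload often favors owned hardware; bursty workloads
    # favor cloud because you only pay for the hours you actually use.
```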

Do you feel IT still carries the title of a cost center rather than revenue driver?

That solely depends on the CIO. WDC is a merged combination of WD, HGST, and SanDisk. 3 years ago, WD and HGST viewed IT solely as a cost center to be minimized. Now, our company views IT as a key enabler for marketing, sales, manufacturing, business intelligence, supply chain, and productivity. This change occurred because our CIO and his team set out to understand every facet of our business, and then focused on delivering innovation at every opportunity. “Keeping the lights on”, low-value work was moved to SaaS and cloud, allowing our internal resources to focus on innovation and delivering value.

What are you (the CIO) doing to support innovation in the company and its own organization to deliver better solutions?

Understanding and getting directly involved in every facet of the business, and then using that knowledge to craft innovative solutions that solve problems. Examples:
a. Highly complex, cloud-engineered big data solutions leading to improved manufacturing yields and improved product quality
b. SaaS-hosted applications that allow people to be productive anywhere, on any device
c. Manufacturing application consolidation and virtualization leading to reduced cost and higher uptime for factories

We are hearing so much about the internet of things – what does, or could, the internet of things look like for your business?

We love the IoT, because all of that data ends up on our products! WDC now views itself as a key enabler of future progress. IoT will provide gigantic data sets – which our products will host – that will yield advancements in medicine, technology, economics, science, and every other field. The solution to every problem is somewhere in that data swamp, and we will be part of pulling knowledge from it. Internally, IoT has already been helping us for years in our manufacturing processes.

Are there hiring challenges based on the economy we're currently facing today?

We are having good success recruiting good people. The globalization of the labor market helps somewhat, but we are able to hire leaders in the US as needed.

Has the idea of using cloud changed your mindset of using outsourced/Managed Services?

Our preference for providing technology services is first to use SaaS, so we can get totally out of that business (email, HRIS, file sharing, etc). Then either cloud or on-premise depending on the ROI and circumstances. We find value in specific managed services, such as network monitoring/NOC services or system administration. There is a place for managed services, because cloud isn’t going to do it all.

If you could give guidance to any CIO, IT manager or director about how to position their careers, what would you tell them?

Know your technology – that’s a given – but you must know your business. Get involved with them, understand their goals and challenges, and seek to address them. Most people will accept the help when it comes from a spirit of partnership and mutual success. Failure to do so just leads to “shadow IT” – they will do your job themselves.[/vc_column_text][/vc_column][/vc_row][vc_row][vc_column][vc_row_inner css=".vc_custom_1526366999142{padding-top: 50px !important;padding-right: 50px !important;padding-bottom: 50px !important;padding-left: 50px !important;background-color: #ec884f !important;}"][vc_column_inner][grve_callout title="MEET THE TECH EXEC INTERVIEWS" heading_tag="h2" heading="h2" button_text="LEARN MORE" button_color="green" button_hover_color="white" button_link="url:http%3A%2F%2Fwww.managedsolution.com%2Fmeet-the-c-level-interview%2F|||" el_class="txt-white"]Managed Solution is conducting interviews as part of an outreach initiative to share trends and engage technology enthusiasts in the southwest.[/grve_callout][/vc_column_inner][/vc_row_inner][/vc_column][/vc_row]

Meet the Tech Exec: Alex Bates, Senior Director of Software Development, Aspen Technology

[vc_row gmbt_prlx_parallax="up" font_color="#ffffff" css=".vc_custom_1501859784808{padding-top: 170px !important;padding-right: 0px !important;padding-bottom: 190px !important;padding-left: 0px !important;background: rgba(55,82,161,0.66) url(https://managedsolut.wpengine.com/wp-content/uploads/2017/08/CIO-Interview-header-Managed-Solution-1.jpg?id=) !important;background-position: center !important;background-repeat: no-repeat !important;background-size: cover !important;*background-color: rgb(55,82,161) !important;}"][vc_column][vc_column_text]

MEET THE TECH EXEC

Alex Bates, Senior Director of Software Development, Aspen Technology

Former CTO of Mtell, which was acquired by Aspen Technology

[/vc_column_text][/vc_column][/vc_row][vc_row css=".vc_custom_1501859913491{background-color: #e0e0e0 !important;}"][vc_column width="1/2"][vc_column_text]

To download the full magazine and read the full interviews, click here.
Alex brings a unique perspective to Mtell at the intersection of diagnostics, machine learning, and the IoT. He pursued his interest in adaptive learning systems as an undergraduate, performing DARPA-funded research in neural networks, vision, and biomedical diagnostics, authoring several peer-reviewed publications. In 2000 he jumped into the private sector, applying analytics on some of the world’s largest data warehouses at Teradata, a pioneer in parallel database technology. 
In 2006 Alex co-founded Mtelligence (now Mtell) to harness the deluge of sensor data in the industrial IoT, with a mission to create a world that doesn’t break down. Mtell’s machine learning platform is used to monitor global fleets of offshore drilling rigs, railroad engines, and process equipment, in effect creating a distributed immune system to protect equipment and personnel.  A principal architect of the system, Alex is lead inventor on several patents in the area of sensor networks and machine learning. Mtell was acquired in 2016 by AspenTech, the global leader in process optimization software.
Alex enjoys sharing experiences with current and future entrepreneurs and technologists, and participates in mentoring through the Entrepreneurs' Organization (EO), the Startup Leadership Program, and other organizations. Alex received degrees in mathematics and computer science with a concentration in neuroscience.

[/vc_column_text][/vc_column][vc_column width="1/2"][vc_column_text][vc_single_image image="17925" img_size="large" alignment="center"][/vc_column_text][/vc_column][/vc_row][vc_column][vc_column_text]

When you were a kid what did you want to grow up to be?
When I was around 7, I was 100% sure I wanted to be a ninja. I grew up in a lot of different places, Portland, Oakland, LA, NYC, Madison. Dad was a journalist, and would move the family around.
Are you leveraging the cloud?
There is a cloud initiative, and various products have a certain degree of cloud enablement; it's certainly a priority to support that. We will need to support a hybrid model. Before the recent merger with Aspen we used Amazon Web Services, and most of our cloud use was for pilots and proofs of concept; our big customers like Exxon Mobil would use big data centers. In the industrial realm there is more adoption among small to medium-sized companies, while enterprises are guarded. They can afford it, but the attitude has kind of been: it's safe, but is it safe?
If you were stranded on a deserted island, what 3 things would you bring and why?
Water purifier, solar producing unit, and my laptop. The boring answer. I figure I could scrounge and produce food, maybe a fish hook, I could probably make one. I've seen enough reality shows, I'm an engineer so I could figure out how to make shelter. The non-boring answer would be a boat and water toys.
Do you feel IT still carries the title of a cost center rather than revenue driver? Do you see a shift as IT being a revenue driver?
For us, it's a little bit more of an enabler, especially with cloud computing and DevOps. With DevOps there's more support of software, so that has shifted from being just IT staff; it's now intersected with engineering.
Big data analytics? How do you leverage that?
Some customers are more paranoid about hoarding their own data, and others are more open to sharing it because they get greater benefits. We want anywhere from a couple of months to a year of data; with industrial plants, you are going to want to capture seasonal variables. You don't need all four seasons, but the more the better. We learn failure patterns, and we can collect them across customers, so we don't need every customer to have experienced those failures.
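To make the "learning failure patterns" idea concrete, here is a minimal sketch of training a classifier on windowed sensor readings labeled as normal or pre-failure. The synthetic data, features and model choice are purely illustrative assumptions, not Mtell's actual platform.

```python
# Illustrative sketch: learn equipment failure patterns from windowed sensor
# data. Synthetic readings and a generic classifier stand in for a real
# predictive-maintenance pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def window_features(failing):
    # One hour of per-minute vibration readings; failing equipment drifts upward.
    readings = rng.normal(1.0, 0.05, size=60)
    if failing:
        readings += np.linspace(0.0, 0.5, 60)  # slow drift preceding a failure
    return [readings.mean(), readings.std(), readings.max(),
            readings[-10:].mean() - readings[:10].mean()]

labels = np.array([i % 4 == 0 for i in range(2000)])      # 25% pre-failure windows
features = np.array([window_features(f) for f in labels])

X_train, X_test, y_train, y_test = train_test_split(features, labels, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```

Because the learned patterns live in the model rather than in any one plant's raw data, the same idea extends to pooling failure signatures across customers.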
What superpower do you want most?
I've always been interested in AI, a superhuman intelligence. Ideally one that could expand without limits, a greater capacity beyond the brain.
When you think about delivering the best solution to your customers, what does that mean to you?
For us, with our customers, we travel out and get a whole new appreciation for the challenges they face. You can meet them on a drilling rig and get a whole new appreciation for what they face. Like that movie Deepwater Horizon that is out: their lives are on the line, and the people in charge are responsible for many lives. Our software is supposed to help reduce risks and predict catastrophic incidents. For us to deliver value, we have to have state-of-the-art monitoring capabilities and integrate with their work processes so things don't slip through the cracks. We have to deliver value for improving equipment; those are some of our metrics.
Are there hiring challenges based on the economy we're currently facing today?
The A+ players are always highly sought after; companies go after high-caliber individuals. Some people have the perception that great candidates graduate and go to the Bay Area. There might be some truth to that, but we are developing that talent locally, and we are also hiring in Boston. We hire data scientists, which is a highly sought-after skillset, so we compete with Amazon and IBM; we are always competing for talent.
Has the idea of using cloud changed your mindset of using outsourced/Managed Services?
Offloading a lot of the IT support - I think everyone in the industrial realm is warming up to the idea, especially on the small-to-medium side, where no one can support that themselves. For big data, the costs are still a little bit high to truly host it in the public cloud, and some of the technologies are sort of abstract. The best scenario would be to use public cloud for elastic computing, scale it back down, and figure out how to transfer the data. But I definitely think it's the wave of the future.
What does IOT mean to you?
That is the core of our business: industrial IoT is what we offer. Our customers have lots of distributed assets that produce products or transform raw materials, and we have all these connected devices. There are a lot of enabling factors, like sensors and storing big data. Certainly it's fundamental on an industrial level.

[/vc_column_text][/vc_column][vc_row][vc_column][vc_column_text][vc_cta_button2 h2="" title="LEARN MORE" color="belizehole" accent_color="#ed884e" link="url:http%3A%2F%2Fwww.managedsolution.com%2Fmeet-the-c-level-interview%2F|title:managedsolution.com|"]

MEET THE TECH EXEC INTERVIEWS

Managed Solution is conducting interviews as part of an outreach initiative to share trends and engage technology enthusiasts in the southwest.

[/vc_cta_button2][/vc_column_text][/vc_column][/vc_row]

Managed Solution Named to 2017 CRN Fast Growth 150 List

[vc_row][vc_column width="1/2"][vc_column_text]

San Diego, CA – CRN®, a brand of The Channel Company, has named Managed Solution to its 2017 Fast Growth 150 list. The list is CRN’s annual ranking of North America-based technology integrators, solution providers and IT consultants with gross sales of at least $1 million that have experienced significant economic growth over the past two years. The 2017 list is based on gains in gross revenue between 2014 and 2016, and the companies recognized represent a total, combined revenue of more than $16,717,688,643.
“The companies on CRN’s 2017 Fast Growth 150 list are thriving in what is now a very tumultuous, demanding IT channel climate,” said Robert Faletra, CEO of The Channel Company. “This remarkable group of solution providers has successfully adapted to a landmark industry shift away from the traditional VAR business model to a more services-driven approach, outpacing competitors and emerging as true channel leaders. We congratulate each of the Fast Growth 150 honorees and look forward to their continued success.”
Managed Solution is proud to receive this acknowledgment of the company's continuing success. Managed Solution is a Technology as a Service (TaaS) company offering hardware, software and premium service tiers, and specializes in forecasting technology infrastructure for small to enterprise-sized businesses. Managed Solution provides a full spectrum of managed and professional services with an award-winning US-based 24/7 Help Desk headquartered in Southern California. Managed Solution provides IT services nationwide and is recognized as one of the top 10 National Cloud Service Providers.

[/vc_column_text][/vc_column][vc_column width="1/2"][vc_single_image image="17624" img_size="large" alignment="center"][/vc_column][/vc_row]

Meet the Tech Exec: Vinton G. Cerf, Vice President and Chief Internet Evangelist, Google

[vc_row gmbt_prlx_parallax="up" font_color="#ffffff" css=".vc_custom_1501859784808{padding-top: 170px !important;padding-right: 0px !important;padding-bottom: 190px !important;padding-left: 0px !important;background: rgba(55,82,161,0.66) url(https://managedsolut.wpengine.com/wp-content/uploads/2017/08/CIO-Interview-header-Managed-Solution-1.jpg?id=) !important;background-position: center !important;background-repeat: no-repeat !important;background-size: cover !important;*background-color: rgb(55,82,161) !important;}"][vc_column][vc_column_text]

MEET THE TECH EXEC

Vinton G. Cerf, Vice President and Chief Internet Evangelist, Google

[/vc_column_text][/vc_column][/vc_row][vc_row css=".vc_custom_1501859913491{background-color: #e0e0e0 !important;}"][vc_column width="1/2"][vc_column_text]

To download the full magazine and read the full interviews, click here.
At Google, Vint Cerf contributes to global policy development and continued spread of the Internet. Widely known as one of the "Fathers of the Internet," Cerf is the co-designer of the TCP/IP protocols and the architecture of the Internet. He has served in executive positions at the Internet Society, the Internet Corporation for Assigned Names and Numbers, the American Registry for Internet Numbers, MCI, the Corporation for National Research Initiatives and the Defense Advanced Research Projects Agency and on the faculty of Stanford University.
Vint Cerf sits on the US National Science Board and is a Visiting Scientist at the Jet Propulsion Laboratory. Cerf is a Foreign Member of the Royal Society and the Swedish Academy of Engineering, a Fellow of the IEEE, ACM, American Association for the Advancement of Science, American Academy of Arts and Sciences, British Computer Society, Worshipful Company of Information Technologists, and Worshipful Company of Stationers, and is a member of the National Academy of Engineering.
Cerf is a recipient of numerous awards and commendations in connection with his work on the Internet, including the US Presidential Medal of Freedom, US National Medal of Technology, the Queen Elizabeth Prize for Engineering, the Prince of Asturias Award, the Japan Prize, the Charles Stark Draper award, the ACM Turing Award, the Legion d’Honneur and 29 honorary degrees.

[/vc_column_text][/vc_column][vc_column width="1/2"][vc_column_text][vc_single_image image="17635" img_size="large" alignment="center"][/vc_column_text][/vc_column][/vc_row][vc_column][vc_column_text]

What did you want to grow up to be when you were a kid?
By the time I was 10, I knew I was going to be a scientist of some kind; I just didn't know what kind. I was intensely interested in math and chemistry. When I was 10 it was 1953, and I know that sounds like just before the Civil War. I got a Gilbert Chemistry Set, and back then you got some really great stuff in the chemistry sets. We could make thermite grenades and other fun things. I got really excited about all that. I stuck with the math and science and chemistry all through high school, then when I went to Stanford, I fell in love with computing. I spent as much time as I could in computer science even though I took a math degree, so I knew I was going to go into that; I just didn't know exactly how. I didn't even meet a computer until I was 15, which for 15-year-olds now would be really weird. Even worse, the one I met was made out of tubes, as this was before transistors.
How did you meet Robert Kahn, the man you developed TCP/IP with?
We met initially at UCLA. He was working for Bolt Beranek and Newman, the company responsible for the design of the Arpanet packet switch. We are talking about 1968/1969, and I was at UCLA working on a PhD in computer science. I was hired to be the principal programmer for what became the Arpanet Network Measurement Center, and the man who hired me was Leonard Kleinrock, one of the founders of Linkabit. Eventually we got Arpanet up and running with four nodes. In early 1970, Bob Kahn came out to kick the tires of this little four-node Arpanet because he had some theories that it was going to congest or run into other problems. His colleagues said that would never happen; that it would be like all the oxygen molecules going to the center of the room and everybody suffocating. As the network measurement guy, my job was to write programs that would drive traffic into this network to see what the response was. We killed the network repeatedly with various traffic patterns. We had names for some of these; one of them was "The Mexican Standoff": two of the packet switches couldn't send to the other one because they were full of traffic and there was no place to store incoming traffic. There was a series of things like that that dictated changes to that initial design, like internal congestion control.
So fast forward to 1972. I'm at Stanford now, joining the faculty, and Bob Kahn is joining DARPA to start what became the "Internetting Program," which was initiated because the defense department at DARPA figured out they wanted to use packet switching for command and control. That means that computers are going to be in ships, airplanes, at sea, and in mobile vehicles, and all we had done was connect computers in fixed locations in air-conditioned rooms. So that was not going to serve for command and control, and that also means voice, video and data. So Kahn is developing a mobile packet radio system and a packet satellite system. Guess which company was responsible for doing system engineering for the packet satellite network? Linkabit. So that's how I met Irwin Jacobs, while this "Internetting Program" was underway.
Bob Kahn ran those programs from 1972-1976 and at that point ARPA asked me to leave Stanford and come to Washington and run the Internetting program, packet radio and satellite program. I used the Arpanet network as part of the Internet system.
On November 22, 1977, we had the first demonstration of a three-network Internet. We had a packet radio van going up and down the Bayshore Freeway, radiating packets into the packet radio net, through a gateway into the Arpanet, all the way across the US, through an internal satellite link into Norway and down to University College London. Then the packets popped out of the Arpanet in the UK, went into the packet satellite network, went back across the Atlantic and down to West Virginia, then back into the Arpanet and all the way down to Los Angeles. So we're moving packets from the moving vehicle in the Bay Area down to USC, and of course they've gone a hundred thousand miles because of the internal Arpanet synchronous satellite hop up and down, another satellite hop on the packet satellite network, and the trips back and forth across the Atlantic. And it worked. Of course I was hopping up and down, because my God, it worked. It's a miracle; it's always a miracle when software works. But it demonstrated that the TCP/IP protocols actually worked in a multi-network environment.
The thing that made it fairly dramatic was that the networks were so different. In the packet radio network the connectivity was changing all the time, and there were different error rates and delays. The packet satellites were geosynchronous, so there was a much longer propagation delay. The packets of the Arpanet were smaller, and the backbone ran at 50 kilobits per second. That was a major milestone.
Is Cerf's Up producing the results you anticipated? What are your plans for the future of Cerf's Up, what's next for 2018?
This is our first experimental initiative in San Diego; Ann Kerr proposed it. We planned roundtable discussions with a mix of people with expertise in specific technologies and research topics. What we discovered is that you have plenty of startups here with a better survival rate than in many other places, but they don't grow very much. There are several possible reasons for that: they get acquired, or getting A-round capital is still turning out to be a challenge. There is a lot of debate about whether San Diego has to go out to Silicon Valley to get money for A rounds, and the answer is probably yes. There is a certain willingness to take risk in Silicon Valley venture companies that doesn't occur here. You don't yet have as many generations of startup successes compared to Silicon Valley. When you look at these local entrepreneurs and somebody offers them $50M for the company, they'll take it. On the other hand, if you've been through this a few times and somebody offers you $50M and you think you can grow it to a billion, you don't take it. When you get a few more cycles of entrepreneurs who are willing to wait, that dynamic might just change with time.
I threatened to come back in six months to find out what happened as a result of Cerf's Up, and we should focus on including the biotech companies next time. You have plenty of resources to make very big successes and haven't so far, except for a few cases like Qualcomm, and they had a long time to get there. One conclusion is that it is still early days for the startup engine to run enough cycles through for people to take more risks. The people sitting around the table at Cerf's Up were comfortable going up to Silicon Valley, although they wish they didn't have to; if it was around the corner it would be easier, and it's easy for somebody to go get them. It would be nice if they had that capability here, and eventually you might actually get there. While San Diego startups have plenty of technical talent, they don't always have management talent for marketing or finance; they sometimes don't appear to have all the skill sets they need to grow.
You have stated that "the real heart of successful business is innovation" and that large companies must absorb new ideas. You also mentioned the low engagement of employees, their untapped potential, and how, unless you create an empowering environment, people will disengage. What do you think about Design Thinking as a movement to elevate teamwork?
I'm a big fan of the notion of design. I'm a huge prototyping fan. You don't know anything until you try it out. We went through 4 iterations of the Internet design. The freedom to iterate and try things out is super important. No matter how hard you work on something, you never get it right until it encounters the real world. The 4th Internet iteration was the final version in 1978 but we are seeing new iterations today such as IPv6.
You learn more from failure than you do from success. If I had to choose between a CEO who has failed before and one who never has, I'd choose the CEO who failed (unless they failed all the time), because they learned from that failure. Design is at the heart of everything, and architecture is at the heart of the design: how the components interact with each other. I love to get at the whiteboard, or, if I can, get the design on one piece of paper and see all the parts and actually visualize their interactions; that is heaven for me, especially as the implementation goes on. If somebody runs over and says "what if we do X," I'll get out my piece of paper, and if you don't have it all on one piece of paper you won't be able to figure it out. I'm oversimplifying, and not all design can be done on one piece of paper, but for me it's an important objective to figure out how it all works in one view.
Why can’t we figure out a way to write software without bugs?
The answer is we need much better software tools to minimize the mistakes we make. I think that should be a big research project for universities. It's much more important now than ever, because we are so dependent on software doing things for us that we aren't aware of. The worst part is that even if the software works as we intended, it may encounter a situation that we didn't think about. This happens a lot in a distributed environment, with things that never interacted before: your software may be confronted with conditions it wasn't designed to cope with, and if it fails in some harmful way as a result, then that is a bug too. The implication is that all the software does need to have some path for updating. And there are all kinds of challenges here to prevent the bad guy from sending a new (deliberately harmful) update; you must verify the update is coming from a legitimate source. We lack tools that help us expose mistakes in our software implementations.
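One concrete form of "verify the update is coming from a legitimate source" is a digital signature check before anything is installed. The sketch below uses an Ed25519 key pair from the Python cryptography package purely as an illustration; real update systems add key distribution, rollback protection and more.

```python
# Illustrative sketch: refuse a software update unless its signature verifies
# against the publisher's public key (Ed25519 via the 'cryptography' package).
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Publisher side: sign the update bytes with the private key.
private_key = Ed25519PrivateKey.generate()
update_blob = b"...firmware or application update bytes..."
signature = private_key.sign(update_blob)

# Device side: verify with the publisher's public key before applying.
public_key = private_key.public_key()
try:
    public_key.verify(signature, update_blob)
    print("signature valid - safe to apply the update")
except InvalidSignature:
    print("signature invalid - discard the update")
```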
What are your predictions of the Internet over the next 5-10 years and beyond?
Right now, the Internet has about 50% global penetration (about 3.5 billion people online). I estimate that by 2020 we will hit at least 70% or even 75%, and a lot of that will be on the back of smartphones with increased 3G or 4G capacity. Generally, the speeds of access and underlying carriage on the net are going to increase over time; we are far away from reaching any limits. The 100 gigabit-per-second backbones will be running at 400 gigabits to a terabit by 2022. You will also discover that cloud-based systems have become the dominant part of computing for most people who use the Internet, as opposed to computation at the edges, and that will actually turn out to be beneficial, because security will turn out to be better. Your access to capacity will be much more flexible because you will have less capital expenditure; you will use more service-based models. That will continue. There will be new access opportunities available by way of satellites to other areas like Africa and Latin America. The Internet of Things will theoretically grow pretty dramatically, on the assumption that it doesn't become inadequately secure or too complicated to use.
Security is a huge issue for IoT. We've already seen one specific example of the failure to secure simple devices like webcams, when most of these devices had zero security or well-known usernames and passwords you couldn't change. The botnet herders found them and recruited them into a botnet without changing their functionality: the webcams still did what they were supposed to do, but the data streams they were sending were re-aimed at a target in the Internet, and there were cascade effects stemming from the consequent denial of service attack.
I'm spending a lot of my time insisting that people pay attention to security, access control, encryption, and strong authentication, because even innocent-sounding data could be hazardous to your safety. I have temperature sensors in every room in my house, and every five minutes they are sending data to a server. It's my guess that if you had access to six months of that data, you could figure out how often everyone that lives there comes and goes and whether they are away. So IoT will be with us. AI will not perhaps reach a crescendo, but there will be enormously large numbers of tools based on the idea of deep learning, for pattern matching and other applications.
One example already exists. Take some of our (Google's) Tensor Processing Units. We trained one of our systems to manage the cooling at one of our data centers. Normally, we would adjust this manually once a week and try to reduce our costs for operating the cooling pumps. We trained one of the TensorFlow systems to manage the data center cooling system, and it saved 40% of costs, because it was smarter and faster than a human being and worked in real time. Being able to use tools like that in the cloud will be very visible five years from now. Google and others will be offering that through APIs; we do it now with our TensorFlow systems and GPUs, and eventually with quantum computing. Healthcare is possibly going to change; most of that will be analytic, since we are looking for patterns. The whole notion of analyzing metabolites as a way of discovering diseases is interesting; it takes a fair amount of analysis and a big database of things to compare the metabolites with. I didn't mention augmented and virtual reality, but we will see that moving very quickly too.
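To give a flavor of what "training a system to manage cooling" means in practice, here is a minimal sketch: fit a model of cooling energy as a function of conditions and setpoint, then pick the setpoint the model predicts is cheapest for the current conditions. The data, variables and relationships are invented for illustration; this is not Google's actual system.

```python
# Illustrative sketch of ML-assisted cooling control: learn energy use as a
# function of load, outside temperature and setpoint, then choose the setpoint
# with the lowest predicted energy. Synthetic data throughout.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)
n = 5000
load = rng.uniform(0.2, 1.0, n)        # fraction of IT load
outside_temp = rng.uniform(5, 35, n)   # degrees C
setpoint = rng.uniform(18, 27, n)      # cooling setpoint, degrees C

# Hypothetical relationship: energy rises with load, outside temperature,
# and with overly aggressive (low) setpoints.
energy = 100 * load + 3 * outside_temp + 15 * (27 - setpoint) + rng.normal(0, 5, n)

X = np.column_stack([load, outside_temp, setpoint])
model = GradientBoostingRegressor().fit(X, energy)

# "Real-time" control step: for the current conditions, scan candidate setpoints.
current_load, current_temp = 0.7, 28.0
candidates = np.linspace(18, 27, 50)
grid = np.column_stack([np.full_like(candidates, current_load),
                        np.full_like(candidates, current_temp),
                        candidates])
best = candidates[model.predict(grid).argmin()]
print(f"recommended setpoint: {best:.1f} C")
```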
The Interplanetary Internet has been in operation since 2004; this work began at the Jet Propulsion Laboratory in 1998. When the two rovers landed on Mars in January 2004, the original design was supposed to transmit data directly from the surface of Mars back to Earth to the Deep Space Network (big 70-meter dishes at three locations: Canberra, Australia; Madrid, Spain; and Goldstone, CA, near Barstow). The planned data rate was 28 kilobits a second, which is not much from a science point of view. It turned out that when the radios were turned on they started to overheat, and they didn't want to ruin the radios. One of the guys said we have X-band radios on board the rovers, and we have X-band radios in the orbiters that map the surface of Mars to help us figure out where the rovers should go. They reprogrammed the orbiters and the rovers to be a store-and-forward network. Because the orbiters were closer to the rovers on Mars than they were to Earth, you could get 128 kilobits per second between the rover and the orbiter, and because the orbiter was out of the atmosphere with bigger solar panels, it could transmit at 128 kilobits per second back to the Deep Space Network. So all the data from Mars has been relayed through a store-and-forward network using new interplanetary protocols, which we had been developing in anticipation of the need for them. We need really rich communication support for manned and robotic space exploration.
By 2022, we will have launched additional missions to Mars, and those hopefully will be carrying interplanetary protocols. What I hope will happen is that other missions launched in the solar system will finish their scientific missions and be repurposed as nodes of an interplanetary backbone to support further exploration. TCP/IP would not work at interplanetary distances, so we had to rewrite the protocols. TCP/IP flow control was really simple: when you ran out of room, you told the other guy to stop. Well, if the other guy is 20 minutes away at the speed of light, he's going to be transmitting for 20 minutes before he hears you say anything, so all that data will be lost. We had to develop a new suite of protocols that we call Delay and Disruption Tolerant Networking, also known as "DTN." The disruption problem is that if you are trying to communicate with something on the surface of Mars and Mars is rotating, then you can't talk to it until Mars rotates back around again. TCP wasn't designed to handle long periods of disruption.
When Earth and Mars are closest together, we are 35 million miles apart, and at the speed of light that's about 3.5 minutes. When we are farthest apart in our orbits, it's 235 million miles, and that's about 20 minutes one way, 40 minutes round trip. So that doesn't work without these new protocols. I hope their use will be ongoing by the end of the 5-year period.
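As a quick back-of-the-envelope check of those one-way delays (using the distances quoted above):

```python
# Back-of-the-envelope check of the one-way light delays quoted above.
SPEED_OF_LIGHT_MILES_PER_SEC = 186_282

for label, miles in [("closest approach", 35e6), ("farthest apart", 235e6)]:
    minutes = miles / SPEED_OF_LIGHT_MILES_PER_SEC / 60
    print(f"{label}: {minutes:.1f} min one way, {2 * minutes:.0f} min round trip")
```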
That doesn't count the other project some of us are scratching our heads about, which is how to build a spacecraft that gets to Alpha Centauri in 100 years of elapsed time. The current propulsion systems would take 65,000 years to get there, which is a long time, so we are looking at alternative propulsion systems, such as ion engines, to get us up to about 20% of the speed of light. And then we'll have to slow down when we get to the other end; otherwise we'll get maybe one picture as we head further into space! Then we have to do autonomous navigation to get there, because you can't do midcourse corrections the way we normally do in the interplanetary systems. When we go to Mars or Jupiter, part of the way there we check to see what the trajectory looks like, and we change the instructions to the spacecraft to do midcourse corrections.
Imagine you have a spacecraft that is a light year away. It takes a year to give it an instruction and another year to find out what happened. This is not exactly interactive. We have to do autonomous navigation. And then there is the thing I care most about, which is: how do you generate a signal from 4.3 light years away that you can actually detect? I'm thinking about how much power I have, and it can't be much, because an interstellar spacecraft payload can't be all that big. Suppose I could generate a 100-watt signal but compress it down to ten to the minus fifteen seconds. That's a big spike that I might be able to detect from 4.3 light years away, except for one problem: even if it's a collimated laser, the beam is going to spread. I'm going to have a very weak signal coming back because of the spread, so I'll need to build a synthetic aperture receiver network to detect the signal and reintegrate it.
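A rough sketch of the arithmetic behind that worry; the pulse energy, divergence, and receiver size here are illustrative assumptions, not a link-budget design:

```python
# Rough arithmetic behind the detection problem described above.
# All numbers are illustrative assumptions, not a real link budget.
import math

pulse_energy_j = 100.0          # energy of a "100 watt" signal gathered over 1 s
pulse_width_s = 1e-15           # compressed into a femtosecond spike
peak_power_w = pulse_energy_j / pulse_width_s

distance_m = 4.3 * 9.46e15      # 4.3 light years in meters
divergence_rad = 1e-6           # even a well-collimated laser spreads (assumed)
spot_radius_m = distance_m * divergence_rad
flux_w_per_m2 = peak_power_w / (math.pi * spot_radius_m ** 2)

print(f"Peak power: {peak_power_w:.1e} W")
print(f"Beam spot radius at Earth: {spot_radius_m:.1e} m")
print(f"Flux at a 1 m^2 receiver: {flux_w_per_m2:.1e} W/m^2")
# The tiny flux per unit area is why a large synthetic-aperture receiver
# network would be needed to integrate enough of the spread-out pulse.
```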
Can you explain the digital dark age and how we can prepare?
Every digital object you have ever met is at risk of disappearing, because the bits went away, or the reader of the medium went away, or the software doesn't run anymore. There are 3 problems. One is technical: how do I consistently copy bits onto new media that I can still read? The second: how do I maintain the metadata that identifies the encoding of this photograph and what it is a photograph of, and how do I maintain the operation of old software if there is no new software that can interpret the coded information? It involves preserving operating systems, application code, and a description of the hardware so I can emulate it in the cloud. Then there is the 3rd problem: what is the business model that allows you to preserve content over a long period of time?
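A toy sketch of the first, purely technical piece: record a fixity checksum and format metadata alongside each object so that every copy onto new media can be verified. The file name, format labels, and fields below are invented examples:

```python
# Toy sketch: keep a fixity checksum and format metadata with every object
# so each migration to new media can be verified. Names are invented examples.
import hashlib
import json
from pathlib import Path

def describe(path, mime_type, created_with):
    data = Path(path).read_bytes()
    return {
        "file": str(path),
        "sha256": hashlib.sha256(data).hexdigest(),  # fixity check for future copies
        "mime_type": mime_type,                      # how the bits are encoded
        "created_with": created_with,                # software needed to interpret them
    }

def verify(record):
    data = Path(record["file"]).read_bytes()
    return hashlib.sha256(data).hexdigest() == record["sha256"]

# Example usage with a hypothetical WordPerfect file:
Path("letter.wpd").write_bytes(b"example bytes")
record = describe("letter.wpd", "application/vnd.wordperfect", "WordPerfect 4.2")
print(json.dumps(record, indent=2))
print("Bits intact after copy:", verify(record))
```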
Historically we had libraries and archives to hold a lot of this stuff in the medium it was created in, and most of those media were relatively long-lasting. Now we have media that doesn't last long. CD-ROM readers may die before the discs do. My 3½-inch floppy disks still have bits on them; I finally found a floppy disk reader that I can plug into my USB port and run on my Mac. I found WordPerfect files, but I didn't find any WordPerfect program on my operating system that could interpret the bits. We have to start thinking our way through a regime that deals with the technical problems. Then there is a whole series of potential legal barriers. Suppose you have a piece of text and it is encoded as a WordPerfect file. I don't know what the patent situation is on WordPerfect. If you pull up this object and it was created in 1982, do we know whether or not the software is now free for people to use? How the heck do you find that out? We have patent registries, so you might be able to figure out whether a patent on a piece of software has expired, but it is relatively difficult. Copyright is a similar problem: do I have the right to propagate a digital object, and can I copy it? You have to search the copyright records, but there may not be any, because you don't have to register anymore. Under the Copyright Act Congress passed in 1976, authors own the copyright the instant they create a work. Sounds good, unless you are somebody like me who wants to do something with this thing; you didn't register it, so I can't find you, I don't know where you are or who you are, and I can't do a deal. This was one problem we had with the Google Book Search project: copyrights had not expired for works that were out of print. We wanted to promote their visibility by digitizing them and making them discoverable through full-text search, but there was resistance to this from copyright holders. The most important thing publishers could do is reinstitute the practice of registering copyrights, so people could find copyright holders to negotiate deals. If we don't solve those problems, the 22nd century is going to wonder about us, because they won't have our tweets, emails, and blogs. These will have evaporated for all the reasons I described; it's like a black hole. Some people will save stuff, and that's preservation by accident, but I want to have a plan for preserving stuff. Some people will say most of it isn't worth preserving, and I will probably agree with that. But some will say, "I really want to save this for my grandchildren," and they should have the tools to do that. We can't offer them those tools yet, because many don't exist.
If you could give guidance to any engineer about how to position their career, given AI, what would you tell them? What will the need be?
AI is not going to do what designers and engineers can do, which is invent, analyze, and do systems analysis, but it can help as a tool. If I have a big system and I'm trying to understand it, the AI tools are going to help with that. They aren't going to do the systems architecture and engineering that you need. As for giving advice to an engineer: in addition to developing their technical skills, I would recommend they become good salespeople. If you don't know how to sell your ideas, forget about doing anything big. I learned I wasn't going to do anything big unless I learned how to sell people on doing what I wanted them to do. A lot of engineers just don't get that. The Internet wasn't just Bob Kahn and me; it was a whole bunch of people who decided they wanted to make that happen. We said, "This is interesting and this is something you want to be a part of."
How do you see the large players like Microsoft, Amazon, Google, and Facebook evolving, and who wins the race?
I think the most important thing is to recognize what technology is driving the industry, to recognize it correctly, and to respond in time. Google realized mobile was becoming extremely important and that we needed to move our advertising focus to mobile; we had to do something to make the mobile environment more friendly, which is what we did with Android. We made it available to everybody, and now there are more than a billion devices using the Android operating system. We shifted from mobile first to artificial intelligence first last year. Recognizing a trend before it becomes a trend is important, and so is figuring out what to do about it. So the question is what comes next after AI. There is something: the thing after AI is self-organization. I think self-organizing systems will be the next really hard nut to crack.

[/vc_column_text][/vc_column][vc_row][vc_column][vc_column_text][vc_cta_button2 h2="" title="LEARN MORE" color="belizehole" accent_color="#ed884e" link="url:http%3A%2F%2Fwww.managedsolution.com%2Fmeet-the-c-level-interview%2F|title:managedsolution.com|"]

MEET THE TECH EXEC INTERVIEWS

Managed Solution is conducting interviews as part of an outreach initiative to share trends and engage technology enthusiasts in the southwest.

[/vc_cta_button2][/vc_column_text][/vc_column][/vc_row]

What is Data Management?

What is Data Management?

By Ben Ward
There is a lot of talk around data management for businesses. In today's ever-changing digital world, businesses are accumulating far more data on consumers than ever before, sometimes without even realizing it. It is becoming critical for businesses, regardless of size, to manage the data they collect and analyze it to decipher specific consumer habits or trends, ultimately to increase profits or reduce costs.

How Can Data Management Help My Business?

Consumer Product Assessment and Feedback

Imagine this scenario:
You broadcast the release of a new product on social media, and your friends, family, and consumers all react digitally to the launch using actions such as Like, Comment, and Share. Do you have time to sift through the comments section and pull out specific keywords to identify how the product is being perceived by consumers?
By utilizing data management, a business can set up metrics and key performance indicators on the data gathered through social media and perform analysis to determine whether the product is being positively or negatively perceived by consumers. Proper data management can help a business understand how target consumers perceive its products or services.
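As a rough illustration of that kind of analysis, a first pass could be as simple as tallying positive and negative keywords across comments. The keyword lists and sample comments below are invented; a real pipeline would use a proper sentiment model:

```python
# Rough illustration: tally positive vs. negative keywords in social-media
# comments about a product launch. Keywords and comments are invented.
POSITIVE = {"love", "great", "awesome", "amazing"}
NEGATIVE = {"broken", "disappointed", "terrible", "expensive"}

comments = [
    "Love the new design, awesome battery life",
    "Way too expensive and shipping was terrible",
    "Great upgrade, my whole team is impressed",
]

score = 0
for comment in comments:
    words = set(comment.lower().split())
    score += len(words & POSITIVE) - len(words & NEGATIVE)

verdict = "positive" if score > 0 else "negative" if score < 0 else "mixed"
print("Overall sentiment:", verdict)
```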

Allocation of Marketing Funds

In this day and age, any business that has a website really should be utilizing the power of an analytics platform. If you are one of these businesses, then you're already one step ahead! Data management occurs in analytics platforms such as Google Analytics; however, if you don't know how to act upon the data displayed inside your analytics platform, then what's the point in even having it? Take the Referrals section of Google Analytics, for example. Google Analytics does a fantastic job of displaying the channel each of your referral visitors comes from. Let's hypothetically say that your business purchased ads on Facebook and Twitter to announce the launch of your new product. After analyzing the number of referrals received directly from the ads hosted on Facebook and Twitter, we see that Facebook produced far more referrals that directly led to product sales than Twitter did. In this case, we would look to reduce our marketing spend with Twitter and allocate that money to Facebook advertising.
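The numbers below are invented, and in practice the referral and sales figures would come from your analytics exports, but a sketch of that reallocation decision might look like this:

```python
# Invented numbers illustrating the reallocation decision described above:
# compare referral-driven sales per ad dollar and shift budget to the winner.
channels = {
    "Facebook": {"spend": 500.0, "referrals": 1200, "sales": 90},
    "Twitter":  {"spend": 500.0, "referrals": 400,  "sales": 15},
}

for name, c in channels.items():
    c["cost_per_sale"] = c["spend"] / c["sales"]
    print(f"{name}: {c['referrals']} referrals, {c['sales']} sales, "
          f"${c['cost_per_sale']:.2f} per sale")

winner = min(channels, key=lambda n: channels[n]["cost_per_sale"])
print(f"Shift budget toward {winner}, which has the lowest cost per sale.")
```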

New Revenue Streams

Adding to the example above, Google Analytics may even reveal referrals from an organic source you were not aware of, one that contributes a surprisingly large number of sales. You may find that more people purchase products from your site when they search on Microsoft's Bing search engine instead of Google. This type of information would not be easily accessible or available without a proper form of data management.

[vc_row][vc_column][vc_cta_button2 h2="Find Your Best Path To A Truly Consistent Hybrid Cloud" title="COST BENEFIT ANALYSIS" size="lg" position="bottom" link="url:http%3A%2F%2Fwww.managedsolution.com%2Faws-azure-compare%2F||" accent_color="#f4c61f"]

Achieve IT infrastructure cost savings of at least 50%

Call Southern California’s most trusted name in cloud at 800-208-3617 for real time pricing and a cost benefit analysis for Microsoft’s Azure and Amazon’s AWS.

[/vc_cta_button2][/vc_column][/vc_row]

3 Reasons to Use an MSP for Remote Server Monitoring

[vc_row][vc_column][vc_column_text]


3 Reasons to Use an MSP for Remote Server Monitoring

Is your IT department overworked and stretched thin? A Managed Services Provider (MSP) can provide cloud-based remote server monitoring to reduce workloads and free up your IT resources. In today's business environment, keeping a network running smoothly is never easy, especially when one IT department is trying to manage every project. Here are three reasons to use a Managed Services Provider for remote server monitoring:

1) Cut costs

With options like Pay-As-You-Go pricing, cloud-based remote server monitoring can save you money while allowing you to scale your company up and down as much as you'd like. In addition, you won't need to hire expensive, highly specialized IT administrators to run your networks. A Managed Services Provider can do all of this for you, making outrageous IT expenses a thing of the past.

2) Managed on-the-go

Get alerts and real-time updates about your network from your MSP. Now, while you are on the road to a business meeting or conference, you can check on your network's activity without missing a beat. Stay in the loop no matter what by viewing your network at a glance from your mobile device. Just don't check your network and drive.

3) No-hassle setups

It only takes a few minutes for each workstation or server to be up and running with cloud-based server monitoring. Even better, there is no user downtime during the process, so your business can keep running while everything is being set up. Worry-free installation gives you time to focus on your business while still having peace of mind that there won't be any software compatibility or vulnerability issues.

[/vc_column_text][/vc_column][/vc_row][vc_row font_color="#ffffff" css=".vc_custom_1471641930410{background-color: #6994bf !important;}"][vc_column][vc_column_text css_animation="appear"]

Learn more about managed services provided by Managed Solution


[/vc_column_text][/vc_column][/vc_row][vc_row][vc_column][vc_column_text]

Call us at 800-790-1524

[/vc_column_text][/vc_column][/vc_row]