[vc_row css=".vc_custom_1501859913491{background-color: #e0e0e0 !important;}"][vc_column width="1/2"][vc_column_text]

To download the full magazine and read the full interviews, click here.

Todd is Vice President of Global Infrastructure and IT Operations for Western Digital, the world’s largest data storage company. In this role, he is responsible for on-premises and cloud computing, the global network, communications, data storage, and data centers. Additional responsibilities include support for high-performance engineering and manufacturing computing, as well as desktop productivity and end-user service and support. Prior to Western Digital, he was responsible for IT Operations and Infrastructure for AMN Healthcare, Amylin Pharmaceuticals, and Siebel Systems. Todd holds an MBA from the University of Georgia. [/vc_column_text][/vc_column][vc_column width="1/2"][grve_single_image image="17940"][/vc_column][/vc_row][vc_row][vc_column][vc_column_text]What’s the #1 area of focus CIOs should concentrate on?

CIOs are really the technology CEOs of their companies. The primary goal of a company is to attract and retain customers, so the CIO should be highly focused on the business strategy for attracting and retaining customers. Many executive peers remain technologically naïve; a CIO has to understand every part of the business, every function, and then be able to advise those functions on meeting their goals through technology. The most successful CIOs truly understand every business function.

What’s your take on Public Cloud?

WDC is a thought leader in public cloud. We are a very large customer of AWS, and we also hold the world’s record for the largest computing job ever run in AWS (we’ve broken our own record several times). We are increasing our usage of Azure; Microsoft is really making progress in making that service more relevant. Actually, Microsoft is a completely different company than it was three years ago, in a good way. Public cloud is one of several tools for corporate computing, but not one that solves every need, or ever will. This is a hybrid world.

What areas are top of mind today when looking at Public Cloud?

Public Cloud isn’t cheap. Some things run cheaper on-premises. Sometimes you have capital to spend, and public cloud is only OpEx. Fit the right tool to the right job; this is a hybrid world.
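To make that trade-off concrete, here is a minimal back-of-the-envelope sketch in Python (our illustration; every figure below is hypothetical, not from the interview) comparing a capitalized on-premises purchase with pay-as-you-go cloud spend over three years:

```python
# Hypothetical 3-year cost comparison: on-premises CapEx vs. cloud OpEx.
# All figures are made up for illustration; plug in real quotes.
YEARS = 3

on_prem_capex = 120_000          # servers + storage, paid up front
on_prem_opex_per_year = 15_000   # power, cooling, support contracts

cloud_cost_per_month = 4_500     # equivalent instances + storage, pay-as-you-go

on_prem_total = on_prem_capex + on_prem_opex_per_year * YEARS
cloud_total = cloud_cost_per_month * 12 * YEARS

print(f"On-prem 3-yr total: ${on_prem_total:,}")   # $165,000
print(f"Cloud   3-yr total: ${cloud_total:,}")     # $162,000
# Which option wins depends on utilization, elasticity needs, and whether
# capital budget is available: fit the right tool to the right job.
```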

Do you feel IT still carries the title of a cost center rather than revenue driver?

That depends solely on the CIO. WDC is a merged combination of WD, HGST, and SanDisk. Three years ago, WD and HGST viewed IT solely as a cost center to be minimized. Now, our company views IT as a key enabler for marketing, sales, manufacturing, business intelligence, supply chain, and productivity. This change occurred because our CIO and his team set out to understand every facet of our business, and then focused on delivering innovation at every opportunity. “Keeping the lights on,” low-value work was moved to SaaS and cloud, allowing our internal resources to focus on innovation and delivering value.

What are you (the CIO) doing to support innovation in the company and in your own organization to deliver better solutions?

Understanding and getting directly involved in every facet of the business, and then using that knowledge to craft innovative solutions that solve problems. Examples:
a. Highly complex, cloud-engineered big data solutions leading to improved manufacturing yields and product quality
b. SaaS-hosted applications that allow people to be productive anywhere, on any device
c. Manufacturing application consolidation and virtualization, leading to reduced cost and higher uptime for factories

We are hearing so much about the Internet of Things – what does, or could, the Internet of Things look like for your business?

We love the IoT, because all of that data ends up on our products! WDC now views itself as a key enabler of future progress. IoT will provide gigantic data sets – which our products will host – that will yield advancements in medicine, technology, economics, science, and every other field. The solution to every problem is somewhere in that data swamp, and we will be part of pulling knowledge from it. Internally, IoT has already been helping us for years in our manufacturing processes.

Are there hiring challenges given the economy we’re currently facing today?

We are having good success recruiting good people. The globalization of the labor market helps somewhat, but we are able to hire leaders in the US as needed.

Has the idea of using cloud changed your mindset of using outsourced/Managed Services?

Our preference for providing technology services is first to use SaaS, so we can get totally out of that business (email, HRIS, file sharing, etc.). Then we use either cloud or on-premises, depending on the ROI and circumstances. We find value in specific managed services, such as network monitoring/NOC services or system administration. There is a place for managed services, because cloud isn’t going to do it all.

If you could give guidance to any CIO, IT manager, or director about how to position their careers, what would you tell them?

Know your technology – that’s a given – but you must know your business. Get involved with them, understand their goals and challenges, and seek to address them. Most people will accept the help when it comes from a spirit of partnership and mutual success. Failure to do so just leads to “shadow IT” – they will do your job themselves.[/vc_column_text][/vc_column][/vc_row][vc_row][vc_column][vc_row_inner css=".vc_custom_1526366999142{padding-top: 50px !important;padding-right: 50px !important;padding-bottom: 50px !important;padding-left: 50px !important;background-color: #ec884f !important;}"][vc_column_inner][grve_callout title="MEET THE TECH EXEC INTERVIEWS" heading_tag="h2" heading="h2" button_text="LEARN MORE" button_color="green" button_hover_color="white" button_link="url:http%3A%2F%2Fwww.managedsolution.com%2Fmeet-the-c-level-interview%2F|||" el_class="txt-white"]Managed Solution is conducting interviews as part of an outreach initiative to share trends and engage technology enthusiasts in the southwest.[/grve_callout][/vc_column_inner][/vc_row_inner][/vc_column][/vc_row]

[vc_row gmbt_prlx_parallax="up" font_color="#ffffff" css=".vc_custom_1501859784808{padding-top: 170px !important;padding-right: 0px !important;padding-bottom: 190px !important;padding-left: 0px !important;background: rgba(55,82,161,0.66) url(https://managedsolut.wpengine.com/wp-content/uploads/2017/08/CIO-Interview-header-Managed-Solution-1.jpg?id=) !important;background-position: center !important;background-repeat: no-repeat !important;background-size: cover !important;*background-color: rgb(55,82,161) !important;}"][vc_column][vc_column_text]

MEET THE TECH EXEC

Alex Bates, Senior Director of Software Development, Aspen Technology

Former CTO of Mtell, which was acquired by Aspen Technology

[/vc_column_text][/vc_column][/vc_row][vc_row css=".vc_custom_1501859913491{background-color: #e0e0e0 !important;}"][vc_column width="1/2"][vc_column_text]

To download the full magazine and read the full interviews, click here.
Alex brings a unique perspective to Mtell at the intersection of diagnostics, machine learning, and the IoT. He pursued his interest in adaptive learning systems as an undergraduate, performing DARPA-funded research in neural networks, vision, and biomedical diagnostics, authoring several peer-reviewed publications. In 2000 he jumped into the private sector, applying analytics on some of the world’s largest data warehouses at Teradata, a pioneer in parallel database technology. 
In 2006 Alex co-founded Mtelligence (now Mtell) to harness the deluge of sensor data in the industrial IoT, with a mission to create a world that doesn’t break down. Mtell’s machine learning platform is used to monitor global fleets of offshore drilling rigs, railroad engines, and process equipment, in effect creating a distributed immune system to protect equipment and personnel.  A principal architect of the system, Alex is lead inventor on several patents in the area of sensor networks and machine learning. Mtell was acquired in 2016 by AspenTech, the global leader in process optimization software.
Alex enjoys sharing experiences with current and future entrepreneurs and technologists, and participates in mentoring through Entrepreneur Organization (EO), Startup Leadership Program, and other organizations.  Alex received degrees in Mathematics and Computer Science with a concentration in neuroscience.

[/vc_column_text][/vc_column][vc_column width="1/2"][vc_column_text][vc_single_image image="17925" img_size="large" alignment="center"][/vc_column_text][/vc_column][/vc_row][vc_column][vc_column_text]

When you were a kid what did you want to grow up to be?
When I was around 7, I was 100% sure I wanted to be a ninja. I grew up in a lot of different places: Portland, Oakland, LA, NYC, Madison. Dad was a journalist and would move the family around.
Are you leveraging the cloud?
There is a cloud initiative; various products have a certain degree of cloud enablement, and it's certainly a priority to support that. We will need to support a hybrid model. Before the recent merger with Aspen we used Amazon Web Services, but most of our cloud use was for pilots and proofs of concept; our big customers like ExxonMobil would use big data centers. In the industrial realm there is more adoption among small to medium-sized companies; large enterprises are guarded. They can afford it, but the attitude has kind of been: it's safe, but is it safe?
If you were stranded on a deserted island, what 3 things would you bring and why?
A water purifier, a solar power unit, and my laptop. That's the boring answer. I figure I could scrounge and produce food; maybe a fish hook, I could probably make one. I've seen enough reality shows, and I'm an engineer, so I could figure out how to make shelter. The non-boring answer would be a boat and water toys.
Do you feel IT still carries the title of a cost center rather than revenue driver? Do you see a shift as IT being a revenue driver?
For us, it's a little bit more of an enabler, especially with cloud computing and DevOps. With DevOps there's more support of software, so that has shifted from just IT staff; it now intersects with engineering.
Big data analytics? How do you leverage that?
Some customers are more paranoid about hoarding their own data, and others are more open to sharing it because they get greater benefits. We want anywhere from a couple of months to a year of data; in industrial plants, you are going to want to capture seasonal variables. You don't need all 4 seasons, but the more the better. We learn failure patterns, and we can collect them across customers, so we don't need every customer to have experienced every failure.
What superpower do you want most?
I've always been interested in AI, a superhuman intelligence. Ideally one that could expand without limits, a greater capacity beyond the brain.
When you think about delivering the best solution to your customers, what does that mean to you?
We travel out to our customers and get a whole new appreciation for the challenges they face. When you meet them on a drilling rig, you get a whole new appreciation for what they're up against. Think of that movie Deepwater Horizon: their lives are on the line, and the people in charge are responsible for many lives. Our software is supposed to help reduce risk and predict catastrophic incidents. For us to deliver value, we have to have state-of-the-art monitoring capabilities and integrate with their work processes so things don't slip through the cracks. Delivering value by improving equipment reliability: those are some of our metrics.
Are there hiring challenges given the economy we’re currently facing today?
The A+ players are always highly sought after; companies go after high-caliber individuals. Some people have the perception that great candidates graduate and go to the Bay Area. There might be some truth to that, but we are developing that talent locally. We are also hiring in Boston. We hire data scientists, a highly sought-after role, and we compete with Amazon and IBM; we're always competing for talent.
Has the idea of using cloud changed your mindset of using outsourced/Managed Services?
Offloading a lot of the IT support: I think everyone in the industrial realm is warming up to the idea, especially at the small-to-medium size, where no one can support that in-house. For big data, the costs are still a little high to truly host it in the public cloud, and some of the technologies are still sort of abstract. The best scenario would be to use the public cloud for elastic computing, scale back down, and figure out how to transfer the data. But I definitely think it's the wave of the future.
What does IoT mean to you?
That is the core of our business; Industrial IoT is what we offer. Our customers have lots of distributed assets that produce products or transform raw materials, and we have all these connected devices. There are a lot of enabling factors: sensors and the ability to store big data. It's certainly fundamental at an industrial level.

[/vc_column_text][/vc_column][vc_row][vc_column][vc_column_text][vc_cta_button2 h2="" title="LEARN MORE" color="belizehole" accent_color="#ed884e" link="url:http%3A%2F%2Fwww.managedsolution.com%2Fmeet-the-c-level-interview%2F|title:managedsolution.com|"]

MEET THE TECH EXEC INTERVIEWS

Managed Solution is conducting interviews as part of an outreach initiative to share trends and engage technology enthusiasts in the southwest.

[/vc_cta_button2][/vc_column_text][/vc_column][/vc_row]

[vc_row gmbt_prlx_parallax="up" font_color="#ffffff" css=".vc_custom_1501859784808{padding-top: 170px !important;padding-right: 0px !important;padding-bottom: 190px !important;padding-left: 0px !important;background: rgba(55,82,161,0.66) url(https://managedsolut.wpengine.com/wp-content/uploads/2017/08/CIO-Interview-header-Managed-Solution-1.jpg?id=) !important;background-position: center !important;background-repeat: no-repeat !important;background-size: cover !important;*background-color: rgb(55,82,161) !important;}"][vc_column][vc_column_text]

MEET THE TECH EXEC

Vinton G. Cerf, Vice President and Chief Internet Evangelist, Google

[/vc_column_text][/vc_column][/vc_row][vc_row css=".vc_custom_1501859913491{background-color: #e0e0e0 !important;}"][vc_column width="1/2"][vc_column_text]

To download the full magazine and read the full interviews, click here.
At Google, Vint Cerf contributes to global policy development and continued spread of the Internet. Widely known as one of the "Fathers of the Internet," Cerf is the co-designer of the TCP/IP protocols and the architecture of the Internet. He has served in executive positions at the Internet Society, the Internet Corporation for Assigned Names and Numbers, the American Registry for Internet Numbers, MCI, the Corporation for National Research Initiatives and the Defense Advanced Research Projects Agency and on the faculty of Stanford University.
Vint Cerf sits on the US National Science Board and is a Visiting Scientist at the Jet Propulsion Laboratory. Cerf is a Foreign Member of the Royal Society and the Swedish Academy of Engineering; a Fellow of the IEEE, the ACM, the American Association for the Advancement of Science, the American Academy of Arts and Sciences, the British Computer Society, the Worshipful Company of Information Technologists, and the Worshipful Company of Stationers; and a member of the National Academy of Engineering.
Cerf is a recipient of numerous awards and commendations in connection with his work on the Internet, including the US Presidential Medal of Freedom, the US National Medal of Technology, the Queen Elizabeth Prize for Engineering, the Prince of Asturias Award, the Japan Prize, the Charles Stark Draper Prize, the ACM Turing Award, the Légion d'Honneur, and 29 honorary degrees.

[/vc_column_text][/vc_column][vc_column width="1/2"][vc_column_text][vc_single_image image="17635" img_size="large" alignment="center"][/vc_column_text][/vc_column][/vc_row][vc_column][vc_column_text]

What did you want to grow up to be when you were a kid?
By the time I was 10, I knew I was going to be a scientist of some kind. I just didn't know what kind. I was intensely interested in math and chemistry. When I was 10 it was 1953, and I know that sounds like just before the Civil War. I got a Gilbert chemistry set, and back then you got some really great stuff in the chemistry sets. We could make thermite grenades and other fun things. I got really excited about all that. I stuck with the math and science and chemistry all through high school; then, when I went to Stanford, I fell in love with computing. I spent as much time as I could in computer science even though I took a math degree, so I knew I was going to go into that, I just didn't know exactly how. I didn't even meet a computer until I was 15, which for 15-year-olds now would be really weird. Even worse, the one I met was made out of tubes, as this was before transistors.
How did you meet Robert Kahn, the man you developed TCP/IP with?
We met initially at UCLA. He was working for a company, Bolt Beranek and Newman, responsible for the design of the Arpanet packet switch. We are talking about 1968/1969, and I was at UCLA working on a PhD in computer science. I was hired to be the principal programmer for what became the Arpanet Network Measurement Center, and the man who hired me was Leonard Kleinrock, one of the founders of Linkabit. Eventually we got Arpanet up and running with four nodes. In early 1970, Bob Kahn came out to kick the tires of this little four-node Arpanet because he had some theories that it was going to congest or run into other problems. His colleagues said that would never happen; that it would be like all the oxygen molecules going to the center of the room and everybody suffocating. As the network measurement guy, my job was to write programs that would drive traffic into this network to see what the response was. We killed the network repeatedly with various traffic patterns. We had names for some of these. One of them was “The Mexican Standoff”: two of the packet switches couldn't send to the other one because they were full of traffic and there was no place to store incoming traffic. There were a series of things like that that dictated changes to the initial design, like internal congestion control.
So fast forward to 1972. I’m at Stanford now, joining the faculty, and Bob Kahn is joining DARPA to start what became the "Internetting Program," which was initiated because the Defense Department and DARPA had figured out they wanted to use packet switching for command and control. That means computers are going to be in ships at sea, airplanes, and mobile vehicles, and all we had done was connect computers in fixed locations in air-conditioned rooms. So that was not going to serve for command and control, and it also means voice, video, and data. So Kahn is developing a mobile packet radio system and a packet satellite system. Guess which company was responsible for doing system engineering for the packet satellite network? Linkabit. That's how I met Irwin Jacobs, while this "Internetting Program" was underway.
Bob Kahn ran those programs from 1972-1976, and at that point ARPA asked me to leave Stanford, come to Washington, and run the Internetting, packet radio, and packet satellite programs. I used the Arpanet network as part of the Internet system.
On November 22, 1977, we had the first demonstration of a three-network Internet. We had a packet radio van going up and down the Bayshore Freeway, radiating packets into the packet radio net, through a gateway into the Arpanet, all the way across the US, through an internal satellite link into Norway, and down to University College London. Then the packets popped out of the Arpanet in the UK, went into the packet satellite network, back across the Atlantic, down to West Virginia, then back into the Arpanet and all the way down to Los Angeles. So we're moving packets from the moving vehicle in the Bay Area down to USC, and of course they've gone a hundred thousand miles, because of the internal Arpanet synchronous satellite hop up and down, plus another satellite hop on the packet satellite network, and they've gone back and forth across the Atlantic. And it worked. Of course I was hopping up and down, because my God, it worked. It's a miracle. It's always a miracle when software works. But it demonstrated that the TCP/IP protocols actually worked in a multi-network environment.
The thing that made it fairly dramatic was that the networks were so different. In the packet radio network, the connectivity was changing all the time, and there were different error rates and delays. The packet satellites were geosynchronous, so there was a much longer propagation delay. The Arpanet's packets were smaller, and the backbone ran at 50 kilobits per second. That was a major milestone.
Is Cerf's Up producing the results you anticipated? What are your plans for the future of Cerf's Up, and what's next for 2018?
This is our first experimental initiative in San Diego; Ann Kerr proposed it. We planned roundtable discussions with a mix of people with expertise in specific technologies and research topics. What we discovered is that San Diego has plenty of startups, with a better survival rate than many other places, but they don't grow very much. There are several possible reasons for that: they get acquired, or getting A-round capital is still turning out to be a challenge. There is a lot of debate about whether San Diego companies have to go out to Silicon Valley to get money for A rounds, and the answer is probably yes. There is a certain amount of willingness to take risk in Silicon Valley venture companies that doesn't occur here. You don't yet have as many generations of startup successes compared to Silicon Valley. When you look at these local entrepreneurs and somebody offers them $50M for the company, they'll take it. On the other hand, if you've been through this a few times and somebody offers you $50M and you think you can grow it to a billion, you don't take it. When you get a few more cycles of entrepreneurs who are willing to wait, that dynamic might just change with time.
I threatened to come back in 6 months to find out what happened as a result of Cerf's Up, and we should focus on including the biotech companies next time. San Diego has plenty of resources to make very big successes and hasn't so far, except for a few cases like Qualcomm, and they had a long time to get there. One conclusion is that it is still early days for the startup engine; it needs to run enough cycles through for people to take more risks. The people sitting around the table at Cerf's Up were comfortable going up to Silicon Valley, although they wish they didn't have to. If the capital were around the corner, it would be easier, and it's easy for somebody to go get it. It would be nice if San Diego had that capability; eventually it might actually get there. While San Diego startups have plenty of technical talent, they don't always have management talent for marketing or finance. They sometimes don't appear to have all the skill sets they need to grow.
You have stated, "The real heart of successful business is innovation and that large companies must absorb new ideas." You also mentioned low engagement of employees and their untapped potential, and how unless you create an empowering environment, people will disengage. What do you think about Design Thinking as a movement to elevate teamwork?
I'm a big fan of the notion of design. I'm a huge prototyping fan. You don't know anything until you try it out. We went through 4 iterations of the Internet design. The freedom to iterate and try things out is super important. No matter how hard you work on something, you never get it right until it encounters the real world. The 4th Internet iteration was the final version in 1978, but we are seeing new iterations today, such as IPv6.
You learn more from failure than you do from success. If I had to choose between a CEO who has failed before or one who never has, I'd choose the CEO who failed, unless they failed all the time, because they learned from that failure. Design is at the heart of everything. Architecture is at the heart of the design, with how components interact with each other. I love to get up at the whiteboard. If I can get the design on one piece of paper, see all the parts, and actually visualize their interactions, that is heaven for me, especially as the implementation goes on. If somebody runs over and says "what if we do X," I'll get out my piece of paper, and if you don't have it all on one piece of paper you won't be able to figure it out. I'm oversimplifying, and not all design can be done on one piece of paper, but for me it's an important objective to figure out how it all works in one view.
Why can’t we figure out a way to write software without bugs?
The answer is we need much better software tools to minimize the mistakes we make. I think that should be a big research project for universities. It's much more important now than ever, because we are so dependent on software doing things for us that we aren't aware of. The worst part is that even if the software works as we intended, it may encounter a situation that we didn't think about. This happens a lot in a distributed environment, with things that never interacted before. Your software may be confronted with conditions it wasn't designed to cope with. If it fails in some harmful way as a result, then that is a bug too. The implication is that all software needs to have some path for updating. And there are all kinds of challenges here to prevent the bad guy from sending a new (deliberately harmful) update. You must verify the update is coming from a legitimate source. We lack tools that help us expose mistakes in our software implementations.
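As a concrete illustration of that last point, here is a minimal sketch (our example, not anything Cerf described; it assumes the third-party Python cryptography package) of verifying that an update really comes from a legitimate source before applying it:

```python
# Minimal sketch: verify a software update's origin with an Ed25519
# signature before installing it. Illustrative only.
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey, Ed25519PublicKey,
)
from cryptography.exceptions import InvalidSignature

# Vendor side: sign the update bytes with a private key kept offline.
private_key = Ed25519PrivateKey.generate()
update_blob = b"...firmware image bytes..."
signature = private_key.sign(update_blob)

# Device side: the public key ships with the device; verify before applying.
public_key: Ed25519PublicKey = private_key.public_key()
try:
    public_key.verify(signature, update_blob)
    print("Signature valid: apply the update.")
except InvalidSignature:
    print("Signature invalid: reject the update.")
```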
What are your predictions of the Internet over the next 5-10 years and beyond?
Right now, the Internet has about 50% global penetration (about 3.5 billion people online). I estimate that by 2020 we will hit at least 70% or even 75%, and a lot of that will be on the back of smartphones with the increased capacity of 3G and 4G. Generally, the speeds of access and underlying carriage on the net are going to increase over time. We are far away from reaching any limits: backbones running at 100 gigabits per second today will be running at 400 gigabits to a terabit by 2022. You will also discover that cloud-based systems have become the dominant form of computing for most people using the Internet, as opposed to computation at the edges, and that will actually turn out to be beneficial, because security will turn out to be better. Your access to capacity will be much more flexible because you will have less capital expenditure; you will use more service-based models. That will continue. There will be new access opportunities available by way of satellites to other areas like Africa and Latin America. The Internet of Things will theoretically grow pretty dramatically, on the assumption that it doesn't become inadequately secure or too complicated to use.
Security is a huge issue for IoT. We've already seen one specific example of the failure to secure simple devices like webcams, where most of these devices had zero security or well-known usernames and passwords you couldn't change. The botnet herders found them and recruited them into a botnet without changing their functionality. The webcams still did what they were supposed to do, but the data streams they were sending were re-aimed at a target on the Internet, and there were cascade effects stemming from the consequent denial-of-service attack.
I'm spending a lot of my time insisting that people pay attention to security, access control, encryption, and strong authentication, because even innocent-sounding data could be hazardous to your safety. I have temperature sensors in every room of my house. Every 5 minutes they send data to a server, so it's my guess that if you had access to 6 months of that digital data, you could figure out how often everyone who lives there comes and goes and whether they are away. So IoT will be with us. AI will perhaps not reach a crescendo, but there will be enormously large numbers of tools based on the idea of deep learning, for pattern matching and other applications.
One example already exists. Take some of our (Google’s) Tensor Processing Units. We trained one of them to manage the cooling systems at one of our data centers. Normally, we would tune this manually once a week to try to reduce our costs for operating the cooling pumps. We trained one of the TensorFlow systems to manage the data center cooling system, and it saved 40% of costs, because it was smarter and faster than a human being and worked in real time. Being able to use tools like that in the cloud will be very visible 5 years from now. Google and others will be offering that through APIs. We do that now with our TensorFlow systems and GPUs, and eventually we will with quantum computing. Healthcare is possibly going to change. Most of that will be analytic, since we are looking for patterns. The whole notion of analyzing metabolites as a way of discovering diseases is interesting; it takes a fair amount of analysis and a big database of things to compare the metabolites with. I didn't mention augmented and virtual reality, but we will see that moving very quickly too.
The Interplanetary Internet has been in operation since 2004. This work began at the Jet Propulsion Laboratory in 1998. When the two rovers landed on Mars in January 2004, the original design was supposed to transmit data directly from the surface of Mars back to Earth to the Deep Space Network (big 70-meter dishes at 3 locations: Canberra, Australia; Madrid, Spain; and Goldstone, CA, near Barstow). The planned data rate was 28 kilobits per second, which is not much from a science point of view. It turned out that when the radios were turned on, they started to overheat, and they didn't want to ruin the radios. One of the guys said: we have X-band radios onboard the rovers, and we have X-band radios in the orbiters that map the surface of Mars to help us figure out where the rovers should go. They reprogrammed the orbiters and the rovers to be a store-and-forward network. Because the orbiters were closer to the rovers on Mars than they were to Earth, you could get 128 kilobits per second between the rover and the orbiter. Because the orbiter was out of the atmosphere, with bigger solar panels, it could transmit at 128 kilobits per second back to the Deep Space Network. So all the data from Mars has been relayed through a store-and-forward network using new interplanetary protocols, which we had been developing in anticipation of the need. We need really rich communication support for manned and robotic space exploration.
By 2022, we will have launched additional missions to Mars, and those hopefully will be carrying interplanetary protocols. What I hope will happen is that other missions launched in the solar system will finish their scientific missions and be repurposed as nodes of an interplanetary backbone to support further exploration. TCP/IP would not work at interplanetary distances, so we had to rewrite the protocols. TCP/IP flow control was really simple: when you ran out of room, you told the other guy to stop. Well, if the other guy is 20 minutes away at the speed of light, he's going to be transmitting for 20 minutes before he hears you say anything, so all that data will be lost. We had to develop a new suite of protocols that we call Delay and Disruption Tolerant Networking, also known as “DTN.” The disruption problem is that if you are trying to communicate with something on the surface of Mars and Mars is rotating, then you can't talk to it until Mars rotates back around again. TCP wasn't designed to handle long periods of disruption.
When Earth and Mars are closest together, we are 35 million miles apart, and at the speed of light that's about 3.5 minutes one way. When we are farthest apart in our orbits, it's 235 million miles, and that's 20 minutes one way, 40 minutes round trip. So that doesn't work without these new protocols. I hope their use will be ongoing by the end of the 5-year period.
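For readers who want to sanity-check those figures, here is a minimal back-of-the-envelope sketch in Python (our illustration, not part of the interview) of the one-way light delay and of how much data a 128 kbps sender would have transmitted before a TCP-style "stop" could be heard:

```python
# One-way light delay between Earth and Mars, and the data "in flight"
# before a TCP-style "stop" signal could arrive. Distances are the
# interview's rounded figures; exact values vary with orbital positions.
SPEED_OF_LIGHT_MPS = 186_282          # miles per second

def one_way_delay_min(distance_miles: float) -> float:
    """One-way light-travel time in minutes."""
    return distance_miles / SPEED_OF_LIGHT_MPS / 60

closest, farthest = 35e6, 235e6       # miles
print(f"Closest:  {one_way_delay_min(closest):.1f} min one way")        # ~3.1
print(f"Farthest: {one_way_delay_min(farthest):.1f} min one way, "
      f"{2 * one_way_delay_min(farthest):.0f} min round trip")          # ~21 / ~42

# At the rover-to-orbiter rate of 128 kbps, a sender 20 light-minutes
# away keeps transmitting for 20 minutes after you "say stop":
lost_bits = 128_000 * 20 * 60
print(f"Data sent before 'stop' is heard: {lost_bits / 8 / 1e6:.1f} MB")  # ~19.2
```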
That doesn't count the other project some of us are scratching our heads about, which is how to build a spacecraft to get to Alpha Centauri in 100 years elapsed time. The current propulsion systems would take 65,000 years to get there, which is a long time, so we are looking at alternative propulsion systems, such as ion engines, to get us up to about 20% of the speed of light. And then we'll have to slow down when we get to the other end, otherwise we'll get maybe one picture as we head further into space! Then we have to do autonomous navigation to get there, because you can't do midcourse corrections like we normally do in the interplanetary systems. When we go to Mars or Jupiter, part of the way there we check to see what the trajectory looks like, and we change the instructions of the spacecraft to do midcourse corrections.
Imagine you have a spacecraft that is a light year away. It takes a year to give it an instruction, and it takes another year to find out what happened. This is not exactly interactive, so we have to do autonomous navigation. And then there is the thing that I care most about, which is: how do you generate a signal from 4.3 light years away that you can actually detect? I'm thinking about how much power I have, and it can't be much, because an interstellar spacecraft payload can't be all that big. Suppose I could generate a 100-watt signal, but I could compress it down to ten to the minus fifteen seconds. That's a big spike that I might be able to detect from 4.3 light years away, except for one problem: even if it's a collimated laser, the beam is going to spread. I'm going to have a very weak signal coming back because of the spread, so I'll need to build a synthetic aperture receiver network to detect the signal and reintegrate it.
Can you explain the digital dark age and how we can prepare?
Every digital object you ever met is at risk of disappearing, because the bits went away, or the reader of the medium went away, or the software doesn't run anymore. There are 3 problems. The first is technical: how do I consistently copy bits onto new media that I can still read? The second: how do I maintain the metadata that identifies the encoding of this photograph and what it is a photograph of, and how do I maintain the operation of old software if there is no new software that can interpret the coded information? It involves preserving operating systems, application code, and a description of the hardware so I can emulate it in the cloud. Then there is the 3rd problem: what is the business model that allows you to preserve content over a long period of time?
Historically we had libraries and archives to hold a lot of this stuff in the medium it was created with. But most of those media were relatively long-lasting; now we have media that don't last long. CD-ROM readers may die before the discs do. 3.5-inch floppy disks still have bits on them; I finally found a floppy disk reader that I can plug into my USB port and run on my Mac. I found WordPerfect files, but I didn't find any WordPerfect programs on my operating system that could interpret the bits. We have to start thinking our way through a regime that deals with the technical problems. There are also a whole series of potential legal barriers. Suppose you have a piece of text, and suppose it is encoded as a WordPerfect file. I don't know what the patent situation is on WordPerfect. If you pull up this object and it was created in 1982, do we know whether or not the software is now free for people to use? How the heck do you find that out? We have patent registries, so you might be able to figure out whether a patent expired for a piece of software, but it is relatively difficult. Copyright is a similar problem: do I have the right to propagate a digital object, and can I copy it? You would have to search the copyright records, but there may not be any, because you don't have to register anymore. Since the 1976 Copyright Act, authors own the copyright the instant they create a work. Sounds good, unless you are somebody like me who wants to do something with this thing: you didn't register it, so I can't find you. I don't know where you are or who you are, so I can't do a deal. This was one problem we had with the Google Book Search Project. Copyrights had not expired for works that were out of print. We wanted to promote their visibility by digitizing them and making them discoverable through full-text search, but there was resistance to this from copyright holders. The most important thing the publishers could do is reinstitute the practice of registering copyrights, so people could find copyright holders to negotiate deals. If we don't solve those problems, the 22nd century is going to wonder about us, because they won't have our tweets, emails, and blogs. These will have evaporated for all the reasons I described, and it's like a black hole. Some people will save stuff, and that's preservation by accident, but I want to have a plan for preserving stuff. Some people will say most of it isn't worth preserving, and I will probably agree with that. But some will say, "I really want to save this for my grandchildren," and they should have the tools to do that. We can't offer them those tools yet, because many don't exist.
If you could give guidance to any engineer about how to position their career, what would you tell them? Given AI, what will the need be?
AI is not going to do what designers and engineers can do, which is invent, analyze, and do systems analysis, but it can help as a tool. If I have a big system and I'm trying to understand it, the AI tools are going to help with that. They aren't going to do the systems architecture and engineering that you need. As for giving advice to an engineer: in addition to developing their technical skills, I would recommend they become good salespeople. If you don't know how to sell your ideas, forget about doing anything big. I learned I wasn't going to do anything big unless I learned how to sell people on doing what I wanted them to do. A lot of engineers just don't get that. The Internet wasn't just Bob Kahn and me; it was a whole bunch of people who decided they wanted to make it happen. We said, "this is interesting and this is something you want to be a part of."
How do you see the large players like Microsoft, Amazon, Google, and Facebook evolving, and who wins the race?
I think the most important thing is to recognize what technology is driving industry, to recognize it correctly, and to respond in time. Google realized mobile was becoming extremely important and that we needed to move our advertising focus to mobile; we had to do something to make the mobile environment more friendly, which is what we did with Android. We made that available to everybody, and now there are more than a billion devices using the Android operating system. We shifted from mobile first to artificial intelligence first last year. Recognizing a trend before it becomes a trend, and figuring out what to do about it, is important. So the question is: what is next after AI? There is something. The thing after AI is self-organization. I think self-organizing systems will be the next really hard nut to crack.

[/vc_column_text][/vc_column][vc_row][vc_column][vc_column_text][vc_cta_button2 h2="" title="LEARN MORE" color="belizehole" accent_color="#ed884e" link="url:http%3A%2F%2Fwww.managedsolution.com%2Fmeet-the-c-level-interview%2F|title:managedsolution.com|"]

MEET THE TECH EXEC INTERVIEWS

Managed Solution is conducting interviews as part of an outreach initiative to share trends and engage technology enthusiasts in the southwest.

[/vc_cta_button2][/vc_column_text][/vc_column][/vc_row]

Contact us Today!

Chat with an expert about your business’s technology needs.