Meet the Tech Exec: Vinton G. Cerf, Vice President and Chief Internet Evangelist, Google

To download the full magazine and read the full interviews, click here.
At Google, Vint Cerf contributes to global policy development and to the continued spread of the Internet. Widely known as one of the "Fathers of the Internet," Cerf is the co-designer of the TCP/IP protocols and the architecture of the Internet. He has served in executive positions at the Internet Society, the Internet Corporation for Assigned Names and Numbers, the American Registry for Internet Numbers, MCI, the Corporation for National Research Initiatives, and the Defense Advanced Research Projects Agency, and on the faculty of Stanford University.
Vint Cerf sits on the US National Science Board and is a Visiting Scientist at the Jet Propulsion Laboratory. Cerf is a Foreign Member of the Royal Society and the Swedish Academy of Engineering; a Fellow of the IEEE, the ACM, the American Association for the Advancement of Science, the American Academy of Arts and Sciences, the British Computer Society, the Worshipful Company of Information Technologists, and the Worshipful Company of Stationers; and a member of the National Academy of Engineering.
Cerf is a recipient of numerous awards and commendations in connection with his work on the Internet, including the US Presidential Medal of Freedom, the US National Medal of Technology, the Queen Elizabeth Prize for Engineering, the Prince of Asturias Award, the Japan Prize, the Charles Stark Draper Prize, the ACM Turing Award, the Légion d'Honneur, and 29 honorary degrees.


What did you want to grow up to be when you were a kid?
By the time I was 10, I knew I was going to be a scientist of some kind; I just didn't know what kind. I was intensely interested in math and chemistry. When I was 10 it was 1953, and I know that sounds like just before the Civil War. I got a Gilbert chemistry set, and back then you got some really great stuff in chemistry sets. We could make thermite grenades and other fun things. I got really excited about all that. I stuck with the math, science, and chemistry all through high school; then, when I went to Stanford, I fell in love with computing. I spent as much time as I could in computer science even though I took a math degree, so I knew I was going to go into computing, though I didn't know exactly how. I didn't even meet a computer until I was 15, which for 15-year-olds now would be really weird. Even worse, the one I met was made out of tubes, as this was before transistors.
How did you meet Robert Kahn, the man you developed TCP/IP with?
We met initially at UCLA. He was working for Bolt Beranek and Newman, the company responsible for the design of the Arpanet packet switch. We are talking about 1968/1969, and I was at UCLA working on a PhD in computer science. I was hired to be the principal programmer for what became the Arpanet Network Measurement Center, and the man who hired me was Leonard Kleinrock, one of the founders of Linkabit. Eventually we got the Arpanet up and running with four nodes. In early 1970, Bob Kahn came out to kick the tires of this little four-node Arpanet because he had some theories that it was going to congest or run into other problems. His colleagues said that would never happen; that it would be like all the oxygen molecules going to the center of the room and everybody suffocating. As the network measurement guy, my job was to write programs that would drive traffic into this network to see what the response was. We killed the network repeatedly with various traffic patterns. We had names for some of these. One of them was "The Mexican Standoff": two of the packet switches couldn't send to the other one because they were full of traffic and there was no place to store incoming traffic. A series of problems like that dictated changes to the initial design, like internal congestion control.
So fast forward to 1972. I'm at Stanford now, joining the faculty, and Bob Kahn is joining DARPA to start what became the "Internetting Program," which was initiated because the Defense Department figured out it wanted to use packet switching for command and control. That means computers are going to be in ships, airplanes, and mobile vehicles, and all we had done was connect computers in fixed locations in air-conditioned rooms. That was not going to serve for command and control, and command and control also means voice, video, and data. So Kahn is developing a mobile packet radio system and a packet satellite system. Guess which company was responsible for the system engineering of the packet satellite network? Linkabit. That's how I met Irwin Jacobs, while this "Internetting Program" was underway.
Bob Kahn ran those programs from 1972 to 1976, and at that point ARPA asked me to leave Stanford, come to Washington, and run the Internetting, packet radio, and packet satellite programs. I used the Arpanet as part of the Internet system.
On November 22, 1977, we had the first demonstration of a three-network Internet. We had a packet radio van going up and down the Bayshore Freeway, radiating packets into the packet radio net, through a gateway into the Arpanet, all the way across the US, through an internal satellite link into Norway, and down to University College London. Then the packets popped out of the Arpanet in the UK, went into the packet satellite network, crossed back over the Atlantic down to West Virginia, then went back into the Arpanet and all the way down to Los Angeles. So we were moving packets from a moving vehicle in the Bay Area down to USC, and of course they had gone a hundred thousand miles, because of the internal Arpanet synchronous satellite hop up and down, then another satellite hop on the packet satellite network, and the trips back and forth across the Atlantic. And it worked. Of course I was hopping up and down, because my God, it worked. It's a miracle. It's always a miracle when software works. But it demonstrated that the TCP/IP protocols actually worked in a multi-network environment.
The thing that made it fairly dramatic was that the networks were so different. In the packet radio network the connectivity was changing all the time, and there were different error rates and delays. The packet satellites were geosynchronous, so there was a much longer propagation delay. The Arpanet's packets were smaller, and its backbone ran at 50 kilobits per second. That was a major milestone.
Is Cerf's Up producing the results you anticipated? What are your plans for the future of Cerf's Up, and what's next for 2018?
This is our first experimental initiative in San Diego; Ann Kerr proposed it. We planned roundtable discussions with a mix of people with expertise in specific technologies and research topics. What we discovered is that you have plenty of startups, with a better survival rate than many other places have, but they don't grow very much. There are several possible reasons for that: they get acquired, or getting A-round capital is still turning out to be a challenge. There is a lot of debate about whether San Diego has to go out to Silicon Valley to get money for A rounds, and the answer is probably yes. There is a certain willingness to take risk in Silicon Valley venture firms that doesn't occur here. You don't yet have as many generations of startup successes compared to Silicon Valley. When you look at these local entrepreneurs and somebody offers them $50M for the company, they'll take it. On the other hand, if you've been through this a few times and somebody offers you $50M and you think you can grow it to a billion, you don't take it. When you get a few more cycles of entrepreneurs who are willing to wait, that dynamic might just change with time.
I threatened to come back in six months to find out what happened as a result of Cerf's Up, and we should focus on including the biotech companies next time. You have plenty of resources to make very big successes and haven't so far, except for a few cases like Qualcomm, and they had a long time to get there. One conclusion is that it is still early days: the startup engine hasn't run enough cycles through for people to take more risks. The people sitting around the table at Cerf's Up were comfortable going up to Silicon Valley, although they wish they didn't have to; if the capital were around the corner it would be easier, and it would be nice if San Diego had that capability. Eventually you might actually get there. While San Diego startups have plenty of technical talent, they don't always have management talent for marketing or finance; they sometimes don't appear to have all the skill sets they need to grow.
You have stated, "The real heart of successful business is innovation and that large companies must absorb new ideas". You also mentioned low engagement of employees and their untapped possibility and how unless you create an empowering environment, people will disengage. What do you think about Design Thinking as a movement to elevate teamwork?
I'm a big fan of the notion of design, and I'm a huge prototyping fan. You don't know anything until you try it out. We went through four iterations of the Internet design. The freedom to iterate and try things out is super important. No matter how hard you work on something, you never get it right until it encounters the real world. The fourth iteration became the final version in 1978, but we are still seeing new iterations today, such as IPv6.
You learn more from failure than you do from success. If I had to choose between a CEO who has failed before and one who never has, I'd choose the CEO who failed, unless they failed all the time, because they learned from that failure. Design is at the heart of everything, and architecture is at the heart of design: how the components interact with each other. I love to get at the whiteboard, or, if I can, get the design on one piece of paper, see all the parts, and actually visualize their interactions; that is heaven for me, especially as the implementation goes on. If somebody runs over and says "what if we do X," I'll get out my piece of paper, and if you don't have it all on one piece of paper you won't be able to figure it out. I'm oversimplifying, and not all design can be done on one piece of paper, but for me it's an important objective to figure out how it all works in one view.
Why can’t we figure out a way to write software without bugs?
The answer is that we need much better software tools to minimize the mistakes we make. I think that should be a big research project for universities. It's much more important now than ever, because we are so dependent on software doing things for us that we aren't aware of. The worst part is that even if the software works as we intended, it may encounter a situation we didn't think about. This happens a lot in a distributed environment, with things that never interacted before. Your software may be confronted with conditions it wasn't designed to cope with, and if it fails in some harmful way as a result, that is a bug too. The implication is that all software needs to have some path for updating, and there are all kinds of challenges there in preventing the bad guy from sending a new (deliberately harmful) update. You must verify that the update is coming from a legitimate source. We lack tools that help us expose mistakes in our software implementations.
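Cerf's requirement that an update "is coming from a legitimate source" is usually met with code signing. Here is a minimal sketch in Python using the `cryptography` package; the Ed25519 key choice and the in-memory key pair are illustrative assumptions, not anything the interview specifies:

```python
# Minimal code-signing sketch: the vendor signs an update with a private
# key; the device verifies it with a baked-in public key before applying.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Vendor side: sign the update image.
private_key = Ed25519PrivateKey.generate()
update_image = b"...new firmware bytes..."
signature = private_key.sign(update_image)

# Device side: verify before installing. In practice the public key ships
# with the device; it is derived on the spot here only to keep the sketch
# self-contained.
public_key = private_key.public_key()
try:
    public_key.verify(signature, update_image)
    print("Signature valid: safe to apply the update.")
except InvalidSignature:
    print("Rejected: update does not come from a legitimate source.")
```

Any tampering with `update_image` or `signature` in transit makes the verification raise `InvalidSignature`, which is exactly the property an update path needs against a "deliberately harmful" update.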
What are your predictions for the Internet over the next 5-10 years and beyond?
Right now, the Internet has about 50% global penetration (about 3.5 billion people online). I estimate that by 2020 we will hit at least 70% or even 75%, and a lot of that will be on the back of smartphones with increased 3G and 4G capacity. Generally, the speeds of access and underlying carriage on the net are going to increase over time; we are far away from reaching any limits. Backbones running at 100 gigabits per second today will be running at 400 gigabits to a terabit per second by 2022. You will also discover that cloud-based systems have become the dominant form of computing for most people using the Internet, as opposed to computation at the edges, and that will actually turn out to be beneficial, because security will turn out to be better. Your access to capacity will be much more flexible because you will have less capital expenditure; you will use more service-based models. That will continue. There will be new access opportunities available by way of satellites to areas like Africa and Latin America. The Internet of Things will grow pretty dramatically, assuming it doesn't become inadequately secure or too complicated to use.
Security is a huge issue for IoT. We've already seen one specific example of the failure to secure simple devices like webcams, where most of the devices had zero security or well-known usernames and passwords you couldn't change. The botnet herders found them and recruited them into a botnet without changing their functionality. The webcams still did what they were supposed to do, but the data streams they were sending were re-aimed at a target on the Internet, and there were cascade effects stemming from the consequent denial-of-service attack.
I'm spending a lot of my time insisting that people pay attention to security, access control, encryption, and strong authentication, because even innocent-sounding data could be hazardous to your safety. I have temperature sensors in every room in my house. Every 5 minutes they send data to a server, so it's my guess that if you had access to 6 months of that data you could figure out how often everyone who lives there comes and goes and whether they are away (the sketch below makes this concrete). So IoT will be with us. AI perhaps will not reach a crescendo, but there will be enormously large numbers of tools based on the idea of deep learning, for pattern matching and other applications.
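To make the point about innocent-sounding data concrete, here is the toy calculation referenced above, in Python with synthetic data. The five-minute cadence and six-month window come from the interview; the thermostat-setback pattern and all temperatures are assumptions:

```python
# Synthetic illustration: five-minute temperature readings leak occupancy.
from datetime import datetime, timedelta
import random
import statistics

random.seed(0)
readings = []
t = datetime(2017, 1, 1)
for _ in range(6 * 30 * 24 * 12):           # ~6 months at 5-minute intervals
    home = t.hour >= 18 or t.hour < 8       # occupants home evenings/nights
    base = 21.0 if home else 16.5           # setback while everyone is away
    readings.append((t, base + random.gauss(0, 0.3)))
    t += timedelta(minutes=5)

# Averaging by hour of day recovers the household's daily schedule.
by_hour = {h: [] for h in range(24)}
for ts, temp in readings:
    by_hour[ts.hour].append(temp)
for h in range(24):
    print(f"{h:02d}:00  {statistics.mean(by_hour[h]):.1f} C")
```

The hourly averages trace out exactly when the house warms up and cools down, which is a daily comings-and-goings schedule extracted from nothing but room temperatures.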
On the deep-learning side, one example already exists. Take some of our (Google's) Tensor Processing Units. We trained one of our TensorFlow systems to manage the cooling system at one of our data centers. Normally we would tune this manually about once a week, trying to reduce the cost of operating the cooling pumps. The trained system cut cooling costs by 40% because it was smarter and faster than a human being and operated in real time. Being able to use tools like that in the cloud will be very visible 5 years from now; Google and others will be offering it through APIs. We do that now with our TensorFlow systems and GPUs, and eventually we will with quantum computing. Healthcare is possibly going to change, and most of that will be analytic, since we are looking for patterns. The whole notion of analyzing metabolites as a way of discovering diseases is interesting; it takes a fair amount of analysis and a big database of things to compare the metabolites with. I didn't mention augmented and virtual reality, but we will see that moving very quickly too.
The Interplanetary Internet has been in operation since 2004; the work began at the Jet Propulsion Laboratory in 1998. When the two rovers landed on Mars in January 2004, the original design was supposed to transmit data directly from the surface of Mars back to Earth, to the Deep Space Network's big 70-meter dishes at three locations (Canberra, Australia; Madrid, Spain; and Goldstone, CA, near Barstow). The planned data rate was 28 kilobits per second, which is not much from a science point of view. It turned out that when the radios were turned on, they started to overheat, and nobody wanted to ruin them. One of the guys pointed out that we had X-band radios onboard the rovers and X-band radios in the orbiters that map the surface of Mars to help figure out where the rovers should go. They reprogrammed the orbiters and the rovers to be a store-and-forward network. Because the orbiters were closer to the rovers on Mars than they were to Earth, you could get 128 kilobits per second between the rover and the orbiter, and because the orbiter was out of the atmosphere, with bigger solar panels, it could transmit at 128 kilobits per second back to the Deep Space Network. So all the data from Mars has been relayed through a store-and-forward network using new interplanetary protocols, which we had been developing in anticipation of that need. We need really rich communication support for manned and robotic space exploration.
By 2022 we will have launched additional missions to Mars, and those hopefully will be carrying the interplanetary protocols. What I hope will happen is that other missions launched in the solar system will finish their scientific missions and be repurposed as nodes of an interplanetary backbone to support further exploration. TCP/IP would not work at interplanetary distances, so we had to rewrite the protocols. TCP/IP flow control was really simple: when you ran out of room, you told the other guy to stop. Well, if the other guy is 20 minutes away at the speed of light, he's going to keep transmitting for 20 minutes before he hears you say anything, so all that data will be lost. We had to develop a new suite of protocols that we call Delay and Disruption Tolerant Networking, also known as "DTN." The disruption problem is that if you are trying to communicate with something on the surface of Mars and Mars is rotating, you can't talk to it until Mars rotates back around again. TCP wasn't designed to handle long periods of disruption.
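To illustrate the store-and-forward idea behind DTN, here is a toy sketch in Python. It is not the real Bundle Protocol; the node names and contact windows are invented for illustration:

```python
# Toy store-and-forward relay in the spirit of DTN: each node takes custody
# of bundles and holds them until a contact window to the next hop opens,
# rather than assuming a live end-to-end path the way TCP does.
from collections import deque

class DtnNode:
    def __init__(self, name: str):
        self.name = name
        self.stored = deque()            # custody storage for bundles

    def receive(self, bundle: str):
        self.stored.append(bundle)       # hold until a contact is available

    def contact(self, next_hop: "DtnNode"):
        """Forward every stored bundle while the link to next_hop is up."""
        while self.stored:
            bundle = self.stored.popleft()
            print(f"{self.name} -> {next_hop.name}: {bundle}")
            next_hop.receive(bundle)

rover, orbiter, earth = DtnNode("rover"), DtnNode("orbiter"), DtnNode("earth")
rover.receive("science data #1")
rover.receive("science data #2")
rover.contact(orbiter)    # window while the orbiter passes overhead
orbiter.contact(earth)    # later window when Earth is in view
```

Because custody of the data moves hop by hop, nothing is lost while Mars rotates the rover out of view; the bundles simply wait for the next contact.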
When Earth and Mars are closest together, we are 35 million miles apart, and at the speed of light that's about 3.5 minutes one way. When we are farthest apart in our orbits, it's 235 million miles, and that's 20 minutes one way, 40 minutes round trip. So that doesn't work without these new protocols. I hope their use will be ongoing by the end of the 5-year period.
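As a quick check of those figures, one-way light delay is just distance divided by the speed of light:

```python
# One-way light delay for the Earth-Mars distances quoted above.
SPEED_OF_LIGHT_MI_S = 186_282                  # miles per second

for label, miles in [("closest", 35e6), ("farthest", 235e6)]:
    minutes = miles / SPEED_OF_LIGHT_MI_S / 60
    print(f"{label}: {minutes:.1f} minutes one way")
# Prints ~3.1 and ~21.0 minutes, matching the quoted "3.5 minutes"
# and "20 minutes" in round numbers.
```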
That doesn't count the other project some of us are scratching our heads about, which is how to build a spacecraft that can get to Alpha Centauri in 100 years of elapsed time. Current propulsion systems would take 65,000 years to get there, which is a long time, so we are looking at alternative propulsion systems, such as ion engines, to get us up to about 20% of the speed of light. And then we'll have to slow down when we get to the other end; otherwise we'll get maybe one picture as we head further into space! Then we have to do autonomous navigation to get there, because you can't do midcourse corrections like we normally do in the interplanetary systems. When we go to Mars or Jupiter, part of the way there we check to see what the trajectory looks like, and we change the instructions of the spacecraft to do midcourse corrections.
Imagine you have a spacecraft that is a light year away. It takes a year to give it an instruction and another year to find out what happened; this is not exactly interactive, so we have to do autonomous navigation. And then there is the thing I care most about: how do you generate a signal from 4.3 light years away that you can actually detect? I'm thinking about how much power I have, and it can't be much, because an interstellar spacecraft payload can't be all that big. Suppose I could generate a 100-watt signal but compress it down to ten to the minus fifteen seconds. That's a big spike that I might be able to detect from 4.3 light years away, except for one problem: even if it's a collimated laser, the beam is going to spread, so I'm going to have a very weak signal coming back because of the spread. I'll need to build a synthetic-aperture receiver network to detect the signal and reintegrate it.
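A hedged back-of-envelope for that pulse-compression idea: the 100 W figure and the femtosecond pulse width come from the interview, but the pulse energy, laser wavelength, and transmit aperture below are assumptions made purely for illustration.

```python
# Back-of-envelope for the interstellar signaling idea.
energy_j = 100.0                    # assume 100 W sustained for 1 second
pulse_s = 1e-15                     # "ten to the minus fifteen seconds"
peak_w = energy_j / pulse_s
print(f"peak power: {peak_w:.1e} W")           # ~1e17 W spike

wavelength_m = 1e-6                 # assumed near-infrared laser
aperture_m = 1.0                    # assumed 1 m transmit aperture
light_year_m = 9.461e15
distance_m = 4.3 * light_year_m     # distance to Alpha Centauri
theta_rad = 1.22 * wavelength_m / aperture_m   # diffraction-limited spread
beam_radius_m = theta_rad * distance_m
print(f"beam radius on arrival: {beam_radius_m:.1e} m")
# ~5e10 m, i.e. a beam roughly 50 million km across, which is why the
# received signal is so weak and a large synthetic-aperture receiver
# network would be needed.
```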
Can you explain the digital dark age and how we can prepare?
Every digital object you have ever met is at risk of disappearing, because the bits went away, or the reader of the medium went away, or the software doesn't run anymore. There are three problems. The first is technical: how do I consistently copy bits onto new media that I can still read? The second: how do I maintain the metadata that identifies the encoding of this photograph and what it is a photograph of, and how do I maintain the operation of old software when there is no new software that can interpret the coded information? That involves preserving operating systems, application code, and descriptions of the hardware, so I can emulate it in the cloud. Then there is the third problem: what business model allows you to preserve content over a long period of time?
Historically we had libraries and archives to hold a lot of this stuff in the media it was created with, and most of those media were relatively long-lasting. Now we have media that don't last long: CD-ROM readers may die before the discs do. My 3.5-inch floppy disks still have bits on them, and I finally found a floppy disk reader that I can plug into my USB port and run on my Mac. I found WordPerfect files, but I didn't find any WordPerfect program on my operating system that could interpret the bits. We have to start thinking our way through a regime that deals with these technical problems.

There is also a whole series of potential legal barriers. Suppose you have a piece of text, and suppose it is encoded as a WordPerfect file. I don't know what the patent situation is on WordPerfect. If you pull up this object and it was created in 1982, do we know whether or not the software is now free for people to use? How the heck do you find that out? We have patent registries, so you might be able to figure out whether a patent on a piece of software has expired, but it is relatively difficult. Copyright is a similar problem: do I have the right to propagate a digital object, and can I copy it? You would have to search the copyright records, but there may not be any, because you don't have to register anymore. Under the Copyright Act of 1976, authors own the copyright the instant they create a work. That sounds good, unless you are somebody like me who wants to do something with this thing: you didn't register it, so I can't find you, I don't know where you are or who you are, and I can't do a deal. This is one problem we had with the Google Book Search project. Copyrights had not expired for works that were out of print; we wanted to promote their visibility by digitizing them and making them discoverable through full-text search, but there was resistance to this from copyright holders. The most important thing the publishers could do is reinstitute the practice of registering copyrights, so people could find copyright holders to negotiate deals.

If we don't solve those problems, the 22nd century is going to wonder about us, because they won't have our tweets, emails, and blogs; those will have evaporated for all the reasons I described, and it's like a black hole. Some people will save stuff, and that's preservation by accident, but I want to have a plan for preserving stuff. Some people will say most of it isn't worth preserving, and I will probably agree with that. But some will say, "I really want to save this for my grandchildren," and they should have the tools to do that. We can't offer them those tools yet, because many don't exist.
If you could give guidance to engineers about how to position their careers, what would you tell them? Given AI, what will the need be?
AI is not going to do what designers and engineers can do, which is invent, analyze, and do systems analysis, but it can help as a tool. If I have a big system and I'm trying to understand it, the AI tools are going to help with that. They aren't going to do the systems architecture and engineering that you need. Giving advice to an engineer: in addition to developing their technical skills, I would recommend they become good salespeople. If you don't know how to sell your ideas, forget about doing anything big. I learned I wasn't going to do anything big unless I learned how to sell people on doing what I wanted them to do. A lot of engineers just don't get that. The Internet wasn't just Bob Kahn and me; it was a whole bunch of people who decided they wanted to make it happen. We said, "this is interesting and this is something you want to be a part of."
How do you see the large publishers like Microsoft, Amazon, Google, and Facebook evolving, and who wins the race?
I think the most important thing is to recognize what technology is driving industry, to recognize it correctly, and to respond in time. Google realized mobile was becoming extremely important and that we needed to move our advertising focus to mobile; we had to do something to make the mobile environment more friendly, which is what we did with Android. We made it available to everybody, and now there are more than a billion devices using the Android operating system. We shifted from mobile-first to artificial-intelligence-first last year. Recognizing a trend before it becomes a trend is important, as is figuring out what to do about it. So the question is what comes after AI. There is something: the thing after AI is self-organization. I think self-organizing systems will be the next really hard nut to crack.


MEET THE TECH EXEC INTERVIEWS

Managed Solution is conducting interviews as part of an outreach initiative to share trends and engage technology enthusiasts in the southwest.

LEARN MORE: http://www.managedsolution.com/meet-the-c-level-interview/
