
AI and the Cloud

Some of our readers may not have heard much about artificial intelligence, but you’d have to be living under a rock to have never heard of ‘the cloud,’ and rarer still to have never interacted with one. Both concepts can feel far removed from the average user’s daily life, yet many modern phones carry hardware that enables basic artificial intelligence, and almost all Android and Apple powered phones keep our settings, backups, photos, and all sorts of other information synchronized to one or more cloud services. When I write about artificial intelligence, I have to acknowledge that the major players at the moment are Intel, IBM, and Google. Likewise, when bringing up cloud service providers, Google comes up again, along with Apple, Microsoft, and Amazon Web Services.

For the moment, I’d like to address Google’s position at the intersection of both of these growing areas of technology. Inside my Pixel 3, running on the Google Fi network, there’s a processor specifically designed by Google to accelerate a machine learning framework called TensorFlow. On the Pixel 2 and 3, speech data is processed partly by that on-board chip and partly by the geographically closest Google data center, which requires a fairly robust Internet connection. TensorFlow runs the machine learning models that enable speech and visual processing.
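
To make that concrete, here is a minimal sketch of what on-device inference looks like using TensorFlow Lite, the runtime that packages TensorFlow models for phones. The model file and input here are hypothetical placeholders, not Google’s actual speech pipeline.

# Minimal sketch: running a (hypothetical) speech model on-device
# with TensorFlow Lite. "speech_model.tflite" is a placeholder path.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="speech_model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy audio features shaped to whatever the model expects.
audio_features = np.zeros(input_details[0]["shape"], dtype=np.float32)

interpreter.set_tensor(input_details[0]["index"], audio_features)
interpreter.invoke()  # Inference runs locally; no network round-trip.
prediction = interpreter.get_tensor(output_details[0]["index"])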

After all, isn’t that what we need artificial intelligence for: so a computer can turn visual and audio input into meaningful digital instructions? Speaking of visual processing, since the Pixel 2 a specialized chip known as the Pixel Visual Core has been part of the overall design, enabling new algorithms such as the one behind Night Sight, where the processor interpolates primary color information and extrapolates beautiful color photos out of extremely low-light scenes. The algorithm itself is available for free as an open source library, but the Pixel Visual Core greatly increases the speed at which the data can be processed.
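
Night Sight’s real pipeline involves burst alignment, merging, and tone mapping, but a toy version of the core idea, averaging several noisy exposures so random sensor noise cancels out, fits in a few lines. This is only an illustrative sketch, not Google’s code.

import numpy as np

def stack_frames(frames):
    """Toy burst-stacking: average several noisy low-light frames.

    Averaging N frames cuts random sensor noise by roughly sqrt(N),
    which is one reason burst photography helps in the dark.
    (Night Sight itself does far more: alignment, tone mapping, etc.)
    """
    stacked = np.mean(np.stack(frames).astype(np.float32), axis=0)
    return np.clip(stacked, 0, 255).astype(np.uint8)

# Example: eight simulated noisy exposures of the same dark scene.
rng = np.random.default_rng(0)
scene = np.full((4, 4, 3), 20, dtype=np.float32)  # very dark scene
frames = [scene + rng.normal(0, 10, scene.shape) for _ in range(8)]
print(stack_frames(frames).mean())  # noise largely averaged away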

Pixel 4 phones have a successor to the Visual Core that Google calls the Pixel Neural Core. This upgraded chip can process both audio and visual information. With the Pixel Neural Core, Google has moved more AI functions into the handset itself, relying less on a traditional high-speed connection to its cloud-based TensorFlow data processing infrastructure. That’s not to say Google Assistant will always work without an Internet connection, but it certainly helps for processing the most basic instructions in less-than-optimal coverage areas.
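
The division of labor the Neural Core enables might look something like this: try the on-device model first, and only fall back to a cloud service when the local answer is shaky and a connection exists. Every name below is hypothetical; this is a sketch of the pattern, not Google’s implementation.

def transcribe(audio, local_model, cloud_client, confidence_floor=0.8):
    """Prefer the on-device model; defer to the cloud only when the
    local result is shaky AND a connection is available."""
    text, confidence = local_model.run(audio)   # no network needed
    if confidence >= confidence_floor:
        return text
    if cloud_client.is_connected():
        return cloud_client.transcribe(audio)   # heavier server-side model
    return text  # offline: the best local guess still beats nothing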

Believe it or not, AI helps decide which files stay on your phone and which get offloaded to cloud storage, as any user of the Google Photos application could tell you. Regular backups of photos and videos into Google’s cloud give me some assurance that even if I break my phone, my photos and settings will be downloaded onto the replacement. My phone doesn’t delete the local copies without asking me. It even asks whether I’d like to delete memes saved to local storage, despite the huge amount of storage I have left.
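
As a toy illustration (and emphatically not Google Photos’ real logic), a “free up space” heuristic might only ever nominate files that already have a cloud copy, and only when local storage runs low:

def files_to_offload(files, free_bytes, low_space_threshold=2 * 1024**3):
    """Suggest backed-up files for deletion when local storage is tight.

    `files` is a list of dicts like:
        {"name": "IMG_001.jpg", "size": 3_500_000, "backed_up": True}
    """
    if free_bytes >= low_space_threshold:
        return []  # plenty of room: keep everything local
    # Only offload files that already have a cloud copy, largest
    # first, so each deletion frees the most space.
    candidates = [f for f in files if f["backed_up"]]
    return sorted(candidates, key=lambda f: f["size"], reverse=True)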

In addition to Google Photos, I take full advantage of Google Drive, Docs, Sheets, and Slides on the consumer platform, all of which are cloud-based. As a student at the SUNY Polytechnic Institute, I also enjoy two academically centered cloud experiences, delivered by Google and Microsoft. These cloud services let me work and study from almost any computer or mobile phone. They even host my weekly articles and let me share them with my editor, who accesses the now cloud-based document to add comments. Those comments synchronize, through the magic of the cloud, to all my PCs and every local copy of my article, where I address any typos and resubmit.

The cloud allows us to create, share, edit, and publish while artificial intelligence is still just trying to learn what all of it means. We force-feed our phones all sorts of data from the real world, hoping that somewhere inside an algorithm or chip design the data will suddenly become meaningful to the system built around it. For now, we remain an integral part of that system because computers are still so poor at interpreting that data on their own. But the better these chips get at processing audio and visual information, the less involved humans will be in the interpretation of meaningful data.

So, with that, I’m afraid I have no good news! Thanks to these AI and cloud advances, robots are officially on their way to taking human jobs in fields such as security. The need for humans to monitor both physical security and cyber security is just the first to be optimized (read: downsized) by modern cloud and AI efficiency. The only way to stay afloat is to get ahead of the technological curve and learn how to service and maintain our future robotic overlords.
