Stadia and the Future of Cloud Gaming (Thoughts from a Network/Systems Administrator)

This morning at GDC 2019, Google announced Stadia, its new video game streaming platform.

I touched on game streaming in my podcast about five months ago, when Xbox announced some details about Project xCloud and Google wrapped up public testing of Assassin's Creed Odyssey playable in Chrome.

As I said in my podcast, I'm not totally surprised this is happening. Google has the key components to make it happen (as do Microsoft and possibly Amazon):

Controller Connected to the Data Center – Brilliant.

My guess is that Google skipped a traditional hardware client (i.e. a console) because of latency. To get that quick sub-ten-second load time, let players move from device to device, and reduce input lag, they've made the controller the "console", for all intents and purposes. There's no need to send input from the controller to a console and then to the game instance in the DC. You go right to the cloud.

If the player's inputs are sent straight to the DC, games will feel snappier and won't depend on the computing power of the target device. This probably isn't an issue when playing on a desktop or laptop, since those systems have spare processing power to handle user input without lag, but on a phone or Chromecast it helps deliver a smoother experience for the player.
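The input-path argument above can be sketched with back-of-envelope arithmetic. All the per-hop numbers below are illustrative assumptions, not figures Google has published; the point is only that dropping the console hop removes its processing time from the path.

```python
# Back-of-envelope input-latency comparison for the two paths described
# above. Every per-hop millisecond value here is an assumed, illustrative
# number, not a measured or published figure.

def total_latency_ms(hops):
    """Sum per-hop latencies (in milliseconds) for an input path."""
    return sum(hops.values())

# Traditional path: controller -> console -> game instance in the DC.
console_path = {
    "controller_to_console_bluetooth": 8,
    "console_input_processing": 4,
    "console_to_dc_round_trip": 30,
}

# Stadia-style path: the Wi-Fi controller talks to the DC directly.
direct_path = {
    "controller_to_dc_round_trip": 30,
}

print(total_latency_ms(console_path))  # 42
print(total_latency_ms(direct_path))   # 30
```

Even with generous assumptions, the direct path wins simply because there is one less device in the chain to add processing delay.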

Future Proofing

Stadia being announced at GDC2019

As mentioned in the Stadia announcement video, Project Stream ran at 1080p and 60fps. Stadia at launch will support up to 4K resolution with HDR at 60fps, with future support for up to 8K and 120+ fps. Not shocking.

To achieve those future performance numbers, the user only needs a device that can display the resolution and frame rate and a connection good enough to receive the stream of the game. No more minimum/recommended PC or console specifications. I would imagine the game would scale depending on the device you're playing on.
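A rough bitrate estimate shows what "a connection good enough to receive the stream" means at each tier. The bits-per-pixel figure below is an assumed compression ratio for a modern video codec, not a number from the announcement, so treat the results as order-of-magnitude only.

```python
# Rough sustained-bitrate estimates for the resolutions mentioned above.
# BITS_PER_PIXEL is an assumed post-compression average for a modern
# codec (VP9/H.265-class); real encoders vary widely with content.

BITS_PER_PIXEL = 0.08  # assumption, not a published figure

RESOLUTIONS = {
    "1080p60": (1920, 1080, 60),
    "4K60":    (3840, 2160, 60),
    "8K120":   (7680, 4320, 120),
}

def stream_mbps(width, height, fps, bpp=BITS_PER_PIXEL):
    """Approximate sustained stream bitrate in megabits per second."""
    return width * height * fps * bpp / 1_000_000

for name, (w, h, fps) in RESOLUTIONS.items():
    print(f"{name}: ~{stream_mbps(w, h, fps):.0f} Mbps")
```

Under these assumptions, 1080p60 lands around 10 Mbps, 4K60 around 40 Mbps, and 8K120 in the hundreds of Mbps, which is why the ISP discussion later in this post matters.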

On the back end, Google would either need to cluster (group) more individual servers (instances) to render those higher-resolution images at higher frequencies (fps) or install servers with more raw computing power. The delivery side is trivial for Google: they have the bandwidth and processing power to deliver games at these qualities, and possibly to other display tech such as VR. Think about that for a second.

Unlimited* Processing

Gaming is moving back to the model of the early days of computing, when terminals connected to mainframes that did the actual computing work. Many workplaces had low-powered systems connected to a server so that multiple employees could share computing power. This sounds familiar…

Stadia takes this model of computing and connects players to a server they play on. What's exciting is that this can scale to potentially the entire computing power of Google's data center infrastructure. Developers are no longer limited by whatever GPU/CPU is on the market, and players aren't locked to a specific OS. The quality of a game would be limited only by the processing power Google makes available and by the developer's own constraints, which I'd guess come down to the cost of running the game on the Stadia platform and how many computing resources Google allows on a per-game/per-user basis.

*Regarding "unlimited": there's always a limit. I doubt Google or any other provider would give a single user's game session access to as much processing power as it wants.

Next Door Neighbors

Network Rack of Servers

One can even imagine developers and publishers hosting their game servers literally in the next cabinet over, in the same data center where the games are being played. The lag and ping issues that plague multiplayer games could become almost a thing of the past, since the instance where the game runs is connected to the game server by a high-speed 10 Gbps+ fiber optic connection.

For players with suboptimal connections, high ping times, or game servers in another region, this could be big, putting those gamers on a level playing field. Also, given the way GCP, Azure, and AWS are set up, standing up a regional game server for players in a new region would be as simple as cloning code and configuration to spin up servers there.
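The "clone code and configuration" idea can be sketched as a template stamped out once per region. The region names below are real GCP region identifiers, but the config fields and naming scheme are hypothetical, chosen only for illustration.

```python
# Sketch of per-region deployment from one template, as described above.
# Region names are real GCP regions; the config fields (image,
# machine_type, port) and the naming convention are hypothetical.

BASE_CONFIG = {
    "image": "game-server:v1.2",      # hypothetical container image
    "machine_type": "n1-standard-8",  # a real GCE machine type
    "port": 7777,                     # hypothetical game port
}

REGIONS = ["us-central1", "europe-west4", "asia-northeast1"]

def regional_configs(base, regions):
    """Clone one base config into a deployable config per region."""
    return [
        {**base, "region": region, "name": f"game-server-{region}"}
        for region in regions
    ]

for cfg in regional_configs(BASE_CONFIG, REGIONS):
    print(cfg["name"], "->", cfg["region"])
```

In practice the same pattern is what infrastructure-as-code tools automate: one definition, parameterized by region, applied wherever the player base demands it.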

The Kick-start to the Next Generation of Internet Speeds for Consumers?

These days gigabit-speed home networking equipment is readily available at Best Buy and local consumer electronics stores. Internet access of at least 100 Mbps up and down is available in parts of the United States, and faster speeds are available around the world.

So far there really hasn't been a need for a ridiculously fast Internet connection to browse the web or watch Netflix or YouTube; those with higher-tier connections are businesses and enthusiasts. What happens when customers want the latest games that are only available on Stadia, Project xCloud, and the like, and their ISPs offer only high-latency, sub-50 Mbps/5 Mbps connections?

Customer demand will force ISPs to offer new services, plans, and technologies to meet it. Additionally, with 5G expected to see wide adoption in the next five to ten years AND these AAA streaming platforms available on phones and tablets, wired ISPs will have to compete to keep home wired Internet customers from leaving for wireless ISPs. Expect ISPs in the US to upgrade their networks in the coming years. Also, make sure to contact your representative in Congress to support net neutrality and groups like the EFF.

Imagine playing a game on Stadia with Google Fiber as your ISP… I'm pretty sure Google would do all sorts of prioritization to give players the most direct connection to its DCs and deliver an amazing experience.

– Kuro Kuma

Natural Progression

All this is a natural evolution of gaming and the Internet. Games have been bound by the limitations of a single PC/console, whether it be the CPU, GPU, or RAM. Games are always demanding the latest and greatest of what’s available.

Moving the processing and rendering of a game to the cloud, where an unlimited number of systems can work in unison to render an image and send it to the player, makes sense. The reason gaming hasn't done this until now is infrastructure: players lacked the broadband connections to reach the DCs, and, more importantly, providers like Google lacked the DCs and edge locations around the world able to move the data quickly enough.

Coupled with faster home and wireless Internet access, Google, Amazon, and Microsoft are in a place where the last decade-plus of building networks and DCs crammed with computing power can finally deliver this style of service and gameplay to the player.

I have a hunch that streaming games to any device will be the big story in the video game industry for 2019, and that the next generation of consoles will support some form of early streaming service, if not make it a main selling feature.
