Twitch Data Loss Shows the Time for Data Governance is NOW

We have all probably heard the recent news about the Twitch data breach. Yes, it is a huge deal, and the hack has received a ton of press.

But I am amazed that the thing people are focusing on most is how much money the top streamers are making on a monthly or yearly basis. 

Sensitive Data & Source Code Leaked at Scale

A platform with well over 100 million views a month, owned by an industry giant named after a river, suffered an event like this. Come on, people: we are missing the boat on this one.

I 100% support free enterprise, but the fact that this event happened due to a server misconfiguration should be the focus. This simple mistake allowed sensitive data and source code to leave the virtual building at massive scale.

Here is a direct quote from Twitch on their news page:

"As we said previously, the incident was a result of a server configuration change that allowed improper access by an unauthorized third party. Our team took action to fix the configuration issue and secure our systems."

This short paragraph brings up two major topics.  

The first is that a properly configured system does not always stay properly configured. A million things can cause an unforeseen configuration change, but it usually comes back to human error.

The second is that you do not always know which third-party systems your applications are connected to. If you think you have complete visibility into your systems, you are kidding yourself. You may know today, but what about tomorrow?

These two issues map directly to the concept of drift: something changed, and nobody knew about it.
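To make the idea of drift concrete, here is a minimal sketch of configuration-drift detection: compare a live configuration snapshot against a known-good baseline and report every setting that changed. The setting names and values below are purely illustrative, not taken from Twitch or any real product.

```python
# Minimal, illustrative drift detector: diff a live config against a baseline.

def detect_drift(baseline: dict, live: dict) -> dict:
    """Return {setting: (expected, actual)} for every setting that drifted."""
    drift = {}
    for key in baseline.keys() | live.keys():
        expected = baseline.get(key)
        actual = live.get(key)
        if expected != actual:
            drift[key] = (expected, actual)
    return drift

# Hypothetical baseline vs. what is actually running right now.
baseline = {"bucket_acl": "private", "tls": "1.2", "debug": False}
live     = {"bucket_acl": "public-read", "tls": "1.2", "debug": False}

print(detect_drift(baseline, live))
# -> {'bucket_acl': ('private', 'public-read')}
```

A single flipped setting, like a storage bucket ACL going from private to public, is exactly the kind of "improper access" misconfiguration described in the quote above, and it is invisible unless something is continuously comparing live state against the baseline.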

Data Governance isn’t optional for CI/CD

With aggressive CI/CD pipelines, change is constant. Whether the changes are to configuration, infrastructure, or code, the threat of a mistake is real.

Don’t just look for known issues; be aware of the potential issues that a simple configuration change may introduce. Managing data governance should be a continuous process, not something you do a few times a year.
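One way to make governance continuous is to run policy checks as a gate on every pipeline run, rather than as a periodic audit. The sketch below assumes a hypothetical set of policies; the rules and setting names are invented for illustration and are not a real product API.

```python
# Illustrative governance gate: evaluate a config against simple policies
# on every CI/CD run, and fail the build if anything is non-compliant.

POLICIES = {
    "bucket_acl": lambda v: v == "private",          # storage must stay private
    "tls": lambda v: v in ("1.2", "1.3"),            # only modern TLS versions
    "debug": lambda v: v is False,                   # no debug mode in prod
}

def governance_gate(config: dict) -> list:
    """Return the list of violated policy names; empty means the gate passes."""
    violations = []
    for setting, is_compliant in POLICIES.items():
        if not is_compliant(config.get(setting)):
            violations.append(setting)
    return violations

# In CI, a non-empty result would be printed and the job exited non-zero.
print(governance_gate({"bucket_acl": "public-read", "tls": "1.2", "debug": False}))
# -> ['bucket_acl']
```

Because the gate runs on every change, a misconfiguration is caught on the deploy that introduced it, not months later in an annual review.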

Final Analogy

It is widely known that one of the major reasons the Titanic hit the iceberg in 1912 and sank was the need for speed. The ship was going far too fast through dangerous, iceberg-filled waters, in order to impress the world with how amazing this technologically advanced vessel was. When an iceberg was finally spotted, the ship could not stop in time.

Don’t let your CI/CD pipeline go so fast that you can’t see the virtual iceberg in front of you.

