Evgeny Chereshnev: Let's Secure Our Digital Data!

"There are approximately 2,000 networks that are tracking your data, analyzing, profiling people. It looks like data belongs to everybody but the users."


We're used to giving away lots of our personal data to use apps and online services. But do we have a fair choice? Can we decide what data a company may get from us? No, in most cases it's an "all or nothing" decision, says Evgeny Chereshnev, Vice President of Kaspersky Lab. And he wants to change that: our data should be treated like something we physically possess, like a car, for example.


New every week from the Stifterverband:
The future makers and their visions for education and training, research and technology

Author: Timur Diehn
Production: Webclip Medien Berlin
for the Stifterverband's YouTube channel

The interview was recorded on the sidelines of the 2b AHEAD ThinkTank's Zukunftskongress 2016.

Transcript of the video

Rethink the way data is dealt with. It shouldn't be yes or no; it should be smart, it should be segmented, it should be secure, encrypted, and so on.

We're going to introduce ourselves to things like, say, a smart refrigerator. It sounds beautiful, but what makes it smart? It's only a simple thing: the refrigerator knows who you are. It knows your behaviour. It knows what you buy, what you eat and what needs to be purchased on your behalf. And this is scary for me, because before the chip I was like: "OK, it's a smart refrigerator, it's fun, fantastic future!" Right now I'm saying: "No way! Who told you I want this?" I want it to work a different way. I don't want the refrigerator to know everything about me. I want me and my, whatever you call it, internet profile that I control to know everything about me. I'm OK with that gathering, as long as I control it and encrypt it and store it in a way that is secure. And I'm ready to be asked whenever I'm about to do something. But I'm not ready to identify myself to every piece of hardware in the traditional way, using my Facebook account or Google account or any account, actually, that doesn't belong to me. And this is what's happening, where everything is going, and I want to change that. I really want to change that. We have to start thinking in a different way. We have to start to understand that when you're asked to click "I agree", it's a tradition. And I'm OK that right now we're having it, but it's a wrong way to deal with humans. Because it's not a choice, it's an ultimatum: you either click "I agree" or you're deprived of a service. Some call it a fair deal. My personal opinion is that it's not. It's not! And with this experiment that my team is investigating right now, I'm trying to analyze the possible product scenarios and human-interaction scenarios, like what could people feel, what would the user experience be like 5 or 10 years from now?
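The "internet profile that I control" idea above can be sketched as a small piece of code. This is purely illustrative: all names (`ProfileAgent`, `Query`, the device IDs) are assumptions, not any real product of Chereshnev's team. The point it demonstrates is that a device like the refrigerator submits a question and gets back only the answer it was pre-authorized for, never the profile itself.

```python
# Toy sketch of an owner-controlled profile: devices ask questions,
# the owner's agent answers only what was explicitly authorized.
from dataclasses import dataclass, field


@dataclass
class Query:
    device: str    # e.g. "fridge-kitchen-01" (illustrative ID)
    question: str  # the single fact the device wants to know


@dataclass
class ProfileAgent:
    """Owner-controlled data store; devices get answers, never raw data."""
    facts: dict = field(default_factory=dict)
    # device -> set of questions the owner has authorized for it
    allowed_questions: dict = field(default_factory=dict)

    def ask(self, q: Query):
        # The device must be pre-authorized for this exact question.
        if q.question not in self.allowed_questions.get(q.device, set()):
            return None  # refuse: not "all or nothing", just nothing
        return self.facts.get(q.question)


agent = ProfileAgent(
    facts={"shopping_list": ["milk", "eggs"], "email": "private@example.org"},
    allowed_questions={"fridge-kitchen-01": {"shopping_list"}},
)

print(agent.ask(Query("fridge-kitchen-01", "shopping_list")))  # ['milk', 'eggs']
print(agent.ask(Query("fridge-kitchen-01", "email")))          # None: not authorized
```

The design choice this illustrates is the reversal Chereshnev describes: instead of the user identifying themselves to the hardware, the hardware identifies itself to the user's profile and asks permission.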
And in a perfect world, I want to create a situation where my data belongs to me physically, where it's protected by legislation, by the constitution, and there is no difference for the court whether it's my, I don't know, car, which is something I purchased and I can do whatever I want with; I can just burn it if I wanted to. Unless it harms someone else, I can do it, it's my car. I want the same with my data. And what is even more important, I want to be selective. I want to be in control of what data I give, and to whom. Let me give you an example. Right now, usually, all data about you is out there. And the companies who have learned to gather it properly are doing it. Google is doing it, Apple is doing it, Microsoft is doing it, a lot of companies are doing it on a wide scale, with billions of accounts. And I think that's fine, but right now nobody asks me how this data will be used. And it's either all data or nothing, as I said, a "yes or no" type of situation. What I want to create, in a perfect-world scenario, is several types of data, several feeds that I can deal with differently. For example, say there is a medical institute that is fighting cancer. And they are asking for data: "Listen, we need it", and they explain why: "The more data we have, the more effective it would be, the faster we find the cure; we just need this medical data of yours." And I'm fine with it, OK, I'm going to give them just medical data. But not all of my data. They don't need my payments, they don't need my geolocation, they don't need my emails, they don't need my surfing history. On the other hand, let's say there is a company like Google, or Google Maps or whatever; let's say there is a map involved. And I want to make it better. And they say: "Listen, we need your data on how fast you travel in the city, because we want to learn the traffic."
Well, OK, ask me for just this data, but again, you don't need anything else, right? And those are just simple examples where I could voluntarily give away this data. The same with the government, for example, if they want to learn something, or with the bank, if they want to know whether I'm really the person standing right here, right now, in front of this ATM withdrawing cash. They just want to know this. Again, they don't need to know my medical history, right? And right now it's either all or nothing, and it's scary. There are approximately 2,000 networks that are tracking your data, gathering, you know, analyzing, you know, profiling people. And it looks like data belongs to everybody but the users.

So there would be more security experts engaged in the conversation, and there would be professional developers joining the community. And it's not such a big problem to make it totally secure; most of the technology has already been invented, to be honest. So it can be a very secure container, or encryption key, or digital profile, whatever we call it. And recently I got a patent, a U.S. patent, in February, that basically makes it possible to make it secure, because the patent is about converting biometrics, unique human-body fingerprints, into basically a unique identification. I'm talking about pulse, you know, heart rate, temperature, the voice, biometrics, you know, everything, the retina, combined with some traditional elements like a secret code phrase, for example. If you combine it all together, it becomes a very secure way of doing whatever you want, if you code it correctly and update it and so on, and if it is, what's more important, transparent, because every security technology has to be public. Because otherwise it's not secure. And the best proof of it is encryption: historically, not a single encryption mechanism was really good unless it was put online for everybody to check. And I feel that we should go the same way.
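The talk doesn't describe how the patented method actually works, so the following is only a toy sketch of the general idea of deriving one identifier from several combined factors (pulse, voice, retina, plus a secret phrase). A real system would use proper biometric templates and a key-derivation function, not a bare hash over raw strings; every name and value here is an assumption for illustration.

```python
# Toy illustration: combine several identity factors plus a secret
# phrase into a single deterministic identifier via a hash digest.
import hashlib


def combined_id(factors: dict, secret_phrase: str) -> str:
    """Derive one identifier from multiple factors plus a secret phrase."""
    # Sort keys so the same inputs always produce the same digest.
    material = "|".join(f"{key}={factors[key]}" for key in sorted(factors))
    material += "|secret=" + secret_phrase
    return hashlib.sha256(material.encode("utf-8")).hexdigest()


# Placeholder factor values standing in for real biometric readings.
factors = {"pulse": "62bpm-pattern", "voice": "print-abc", "retina": "scan-xyz"}

id_a = combined_id(factors, "correct horse battery staple")
id_b = combined_id(factors, "wrong phrase")

print(id_a == combined_id(factors, "correct horse battery staple"))  # True: deterministic
print(id_a == id_b)  # False: changing any factor changes the ID
```

The point this mirrors from the talk is that no single factor identifies you; only the combination does, and changing any one component yields a completely different identifier.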