Homomorphic encryption

Published by admin

Genetic data is a rich source of information about our health, genealogy and personal identity. Since medical exams and blood samples are now routinely digitized, they are an easy target for hackers. Why worry? Suppose you sequence your genome to find out whether you may one day suffer from a rare genetic disease, and the test comes back positive. Which employer would want a worker with such a risk? Which insurer would cover your health care?

Doctors and researchers believe that understanding how genes influence disease will require collecting genetic and health data from millions of people. They have already begun planning such projects, among them the Precision Medicine Initiative launched under then US President Barack Obama and Great Britain's 100,000 Genomes Project. A task of this scale will probably require harnessing the processing power of cloud computing, but the online security breaches of recent years illustrate the dangers of entrusting huge, confidential data sets to the cloud. The administrators of the US National Institutes of Health's Database of Genotypes and Phenotypes (dbGaP), a catalog of genetic and medical data, are so concerned about security that they prohibit users from storing its data on computers directly connected to the Internet.

Homomorphic encryption could address these fears by allowing researchers to deposit only a mathematically encoded, or encrypted, form of the data in the cloud. The data is encrypted on a local computer and the encrypted data is then uploaded to the cloud. Computations are performed directly on the encrypted data in the cloud, and the encrypted result is sent back to the local computer, which decrypts the answer. If hackers intercepted the encrypted data anywhere along the way, the underlying data would remain secure. This approach is in complete harmony with our position on dataism.
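The workflow above can be sketched with the Paillier cryptosystem, a classic *additively* homomorphic scheme: multiplying two ciphertexts yields an encryption of the sum of the plaintexts, so the "cloud" can add values it never sees. This is a toy sketch with tiny hardcoded primes for readability, not production cryptography (real deployments use vetted libraries and much larger keys).

```python
# Toy Paillier encryption: computing on data while it stays encrypted.
# NOT production crypto -- tiny primes, no padding, illustration only.
import math
import random

def keygen(p=293, q=433):
    """Build a Paillier key pair from two (toy-sized) primes."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)       # Carmichael's function for n = p*q
    g = n + 1                          # standard simple choice of generator
    mu = pow(lam, -1, n)               # with g = n+1, mu is just lam^-1 mod n
    return (n, g), (lam, mu)           # public key, private key

def encrypt(pk, m):
    n, g = pk
    r = random.randrange(1, n)         # random blinding factor
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(pk, sk, c):
    n, _ = pk
    lam, mu = sk
    x = pow(c, lam, n * n)
    return ((x - 1) // n * mu) % n     # L(x) = (x-1)/n, then scale by mu

# The client encrypts two values; the cloud multiplies the ciphertexts,
# which corresponds to ADDING the underlying plaintexts it never sees.
pk, sk = keygen()
c1, c2 = encrypt(pk, 17), encrypt(pk, 25)
c_sum = (c1 * c2) % (pk[0] ** 2)       # computed without seeing 17 or 25
print(decrypt(pk, sk, c_sum))          # → 42
```

Only the holder of the private key can read the result; the party doing the arithmetic handles nothing but ciphertexts, which is exactly the property that makes cloud-based genomic computation plausible.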

Indeed, by 2020, 50 billion objects of our daily life will be interconnected via the Internet (what is commonly called the Internet of Things), and, more importantly, they will share all their data: how much milk we drink per day, which programs we watch on Netflix, how much light we use, what our blood pressure is, and so on.

Such disclosure runs against the human right to privacy. I do not want Apple to know what my blood pressure is when I am running in the morning. But does Apple really care about this fact about James? The answer is no. Apple is interested only in the aggregate: combined with the data of other Apple Watch users, it reveals that blood pressure rises when people run. And that final result in turn serves to identify health problems and their solutions.

At present, our whole culture is egocentric: the I and the ME. I have human rights; I have the right not to disclose my information. The truth, however, is that nobody is interested in your I; what counts today is the data. And data does not work if it is not shared and processed; without that, the process yields no conclusions. Conclusions that could enable the advancement of science and the wellbeing of humanity.

Many, following Harari's expression in his Homo Deus, consider this dataism a religion. God is the Great Unique Process (GPU) and we are its servants. But such a vision is false, because it rests on the lie that we have innate rights. When we format a hard drive, we are free to install a firewall to block external intrusions, or not to install one and to share the disk with others, in the GPU, at the risk of falling victim to a virus or some other cyberattack. The same applies to humanity. Do we keep our firewall called human rights, based on individuality, or do we share with the community through the unlimited flow of data?

The defenders of dataism generally rely on a simple argument: if you have nothing to hide, why are you afraid of your information being published? Personally, I would put it differently: do you really think that in the GPU, which holds billions upon billions of pieces of information, anyone cares that JOHN X, born on x, living on x street, ate spaghetti yesterday? I do not think so, unless you are a movie star. And even then, what does it matter to the star that this information is known? Of course, the counter-argument is that the star does not necessarily want everyone to know that he has hepatitis, and such information could also hurt his career. The solution, however, is quite simple: it is only a matter of anonymizing the data, since it is the data that counts, not the individual who provides it. And thus we come back to homomorphic encryption. It's time for #disrupture with its #legaldisrupture!

Categories: E-com & Bots
