
Artificial Intelligence Needs Our Data. Should We Be Paid for It?

Artificial intelligence (AI) has the potential to bring real revolutions to certain fields, including medicine. For now, AI is still at an early stage, and to develop, it needs our data.

Robert Chang, an ophthalmologist at Stanford University in the United States, started working with AI several years ago. He believes these advanced technologies can help with the early detection of diseases such as glaucoma in patients who may go on to develop them. But first, the technology needs data: a huge amount of it has to be analyzed for the algorithm to be highly accurate.

According to Wired.com, Robert Chang started with his own patients, but that was not enough: for an algorithm of this caliber to be truly accurate, it takes hundreds of thousands or even millions of pieces of information.

He therefore obtained grants and turned to collaborators at other universities to gather more data.

He also went to donation registries, where people voluntarily make their data available for scientific purposes. Beyond the effort of collecting the data, the doctor also ran into the issue of data confidentiality, as well as the complicated rules governing third-party access to data.

After such a struggle, Chang believes he has found the solution. He is now collaborating with Professor Dawn Song at the University of California, Berkeley to create a safe way for patients to make their data and medical records available to researchers. Their solution is based on cloud technology and is built so that researchers never see the information at all, even when it is used to “train” the AI algorithm. Moreover, patients receive money when their data is used.

This model has implications that go beyond the medical field. In California, Gov. Gavin Newsom has proposed introducing data dividends, which would put some of the wealth generated by the state’s tech firms into residents’ pockets. US Senator Mark Warner has also put forward a legislative proposal that would oblige IT companies to pay users for their data.

All of these initiatives rest on the idea that IT giants make a lot of money from collecting and managing user data without having to pay those users anything.

In reality, however, the idea of reclaiming our rights over personal information is murky. Unlike material goods, such as cars or houses, data is shared in an uncontrolled manner on the Internet, correlated with other sources, manipulated, and interpreted in all sorts of ways, nested like Russian dolls. As the data is passed around, taking different shapes, its real value becomes impossible to estimate.

That is why Dawn Song believes the system should be rebuilt from the root, so that users keep rights over their personal data while others can still access it. This can be done by completely anonymizing the information and by rewarding the users. The idea will soon be tested on a number of patients.

The prototype developed by Song and Chang is called Kara and uses a technique known as differential privacy, which gives all the parties involved only limited access to the elements needed to “train” the AI algorithm. Patients upload photos and relevant medical information to the platform, and Chang tells the AI algorithm what to look for and how to correlate the findings.

Data is stored in the cloud on Song’s platform, which encrypts and anonymizes it. Because all the processing takes place inside this “black box”, the researchers never see the information they are using. In addition, the process cannot be reversed, so the data cannot be decrypted.
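To make the differential-privacy idea more concrete, here is a minimal, hypothetical sketch in Python. It is not Kara’s actual code: it simply adds calibrated Laplace noise to an aggregate statistic computed from simulated patient readings, so a researcher can learn the overall trend without being able to recover any individual’s value. All numbers, names and parameters are illustrative assumptions.

    # Minimal illustration of the differential-privacy idea
    # (hypothetical values; not the actual Kara implementation).
    import numpy as np

    def laplace_mechanism(true_value, sensitivity, epsilon, rng):
        """Return a differentially private estimate of `true_value`.

        Laplace noise scaled to sensitivity/epsilon hides any single
        patient's contribution while keeping the aggregate useful.
        """
        noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
        return true_value + noise

    rng = np.random.default_rng(seed=42)

    # Toy example: intraocular pressure readings (mmHg) from 1,000 patients.
    readings = rng.normal(loc=16.0, scale=3.0, size=1000)

    # Clip each reading to a known range so the sensitivity of the
    # average query is bounded: (upper - lower) / n.
    lower, upper = 5.0, 40.0
    clipped = np.clip(readings, lower, upper)
    sensitivity = (upper - lower) / len(clipped)

    epsilon = 0.5  # smaller epsilon = stronger privacy, noisier answer
    private_mean = laplace_mechanism(clipped.mean(), sensitivity, epsilon, rng)

    print(f"true mean:    {clipped.mean():.2f} mmHg")
    print(f"private mean: {private_mean:.2f} mmHg")

In a setup like the one described above, this kind of noisy aggregate is roughly what a researcher would work with, while the raw records stay encrypted on the platform.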

Tempting, but complicated

The idea of being paid for our personal data may, at first sight, seem advantageous to both parties. It is just very hard to put a price on someone’s data and to determine whose data is more valuable.

In addition, such a system would generate new potential privacy and security issues.

“Such an agreement signed between the two parties could give the social network total rights to the client’s information. The person could be seduced by the seemingly advantageous offer, only to be surprised to find that his entire online activity is being watched by countless third parties (advertising partners, marketers, etc.). For example, even if the social network does everything possible to secure its customers’ data, the third parties that take over that data for monetization purposes may handle it more negligently,” notes Bogdan Botezatu, a computer security specialist at Bitdefender.
