Differential Privacy

The fourth method is differential privacy, which works by adding statistical noise that cancels out when aggregating over many entries. Suppose John wants to upload his symptom curve -- his fatigue over time, for example. Instead of sending the green curve, which is the authentic one, the app adds some noise to it and uploads a slightly noisy curve to the server. As you can see on the top left, many such red curves arrive at the server, and the server cannot tell what the authentic symptom curve is for any individual phone. When the server adds up all these red curves, it obtains the blue curve, which is its final estimate. From this estimate, Jane can see that infected people like John tend to follow this particular type of symptom trajectory. At the same time, Jane cannot tell who contributed which symptom curve, yet the final estimate remains precise because it has been averaged over dozens or even hundreds of people. [1] [2]
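The mechanism described above can be sketched in a few lines of Python. This is a minimal illustration, not the method from [1] or [2]: each user perturbs their curve with Laplace noise (a standard choice in differential privacy, calibrated by assumed `epsilon` and `sensitivity` parameters), and the server averages the noisy uploads, so the zero-mean noise largely cancels.

```python
import math
import random

def laplace_noise(scale, rng):
    # Draw one sample of Laplace(0, scale) noise via inverse-CDF sampling.
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def privatize(curve, epsilon, sensitivity, rng):
    # Each user perturbs every point of their symptom curve before upload.
    # epsilon and sensitivity are illustrative parameters, not values from the text.
    scale = sensitivity / epsilon
    return [v + laplace_noise(scale, rng) for v in curve]

def aggregate(noisy_curves):
    # The server averages the noisy uploads pointwise; the zero-mean
    # noise shrinks roughly as 1/sqrt(n) as more users contribute.
    n = len(noisy_curves)
    return [sum(vals) / n for vals in zip(*noisy_curves)]

rng = random.Random(0)
true_curve = [0.1, 0.4, 0.9, 0.7, 0.3]  # hypothetical daily fatigue scores
uploads = [privatize(true_curve, epsilon=1.0, sensitivity=1.0, rng=rng)
           for _ in range(500)]
estimate = aggregate(uploads)
```

With 500 contributors the averaged estimate tracks the true curve closely, even though any single upload the server sees is heavily noised.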

References

[1] Differentially Private Supervised Manifold Learning with Applications like Private Image Retrieval

[2] DAMS: Meta-estimation of private sketch data structures for differentially private COVID-19 contact tracing, PPML-NeurIPS 2020