You have a dataset that includes confidential data. You use the dataset to train a model.
You must use a differential privacy parameter to keep the data of individuals safe and private.
You need to reduce the effect of user data on aggregated results.
What should you do?
Differential privacy protects against the possibility that repeated queries or reports could eventually reveal an individual's sensitive data. A value known as epsilon measures how noisy, and therefore how private, a report is. Epsilon has an inverse relationship to noise and privacy: the lower the epsilon, the noisier (and more private) the data is.
https://docs.microsoft.com/en-us/azure/machine-learning/concept-differential-privacy