Ethics

Types of Errors

Accuracy does not always tell the whole story. Think about the ramifications of the different types of errors the model can make, Type I (false positives) and Type II (false negatives), and tune accordingly.
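
A rough illustration: the sketch below (assuming a scikit-learn classifier on a made-up imbalanced dataset) shows how shifting the decision threshold trades Type I errors for Type II errors, so it can be tuned towards whichever error is costlier.

.. code-block:: python

    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import confusion_matrix
    from sklearn.model_selection import train_test_split

    # made-up imbalanced dataset standing in for e.g. a disease screen
    X, y = make_classification(n_samples=2000, weights=[0.9, 0.1], random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    # default 0.5 threshold: count Type I (fp) and Type II (fn) errors
    tn, fp, fn, tp = confusion_matrix(y_test, model.predict(X_test)).ravel()
    print(f"threshold 0.5 -> false positives: {fp}, false negatives: {fn}")

    # if a missed positive (Type II) is the costlier error, lower the threshold
    probs = model.predict_proba(X_test)[:, 1]
    tn, fp, fn, tp = confusion_matrix(y_test, (probs >= 0.2).astype(int)).ravel()
    print(f"threshold 0.2 -> false positives: {fp}, false negatives: {fn}")

In a medical screen, for example, a false negative (a missed disease) is usually far costlier than a false positive, so a lower threshold may be worth a drop in headline accuracy.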

Hidden Biases

Biases in past data will be reflected in the model. E.g., a model that screens job applicants, trained on historical hiring decisions, may end up selecting candidates based on gender, race, or age.
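
One simple check is to slice an evaluation metric by a sensitive attribute; a large gap between groups hints that the model has picked up a bias from past data. The sketch below is purely illustrative, with made-up column names (``gender``, ``actual``, ``predicted``) and data.

.. code-block:: python

    import pandas as pd
    from sklearn.metrics import accuracy_score

    # hypothetical test set with predictions and a sensitive attribute
    df = pd.DataFrame({
        "gender":    ["F", "M", "F", "M", "F", "M"],
        "actual":    [1, 0, 1, 1, 0, 0],
        "predicted": [0, 0, 1, 1, 1, 0],
    })

    # compare the same metric across groups
    for group, subset in df.groupby("gender"):
        acc = accuracy_score(subset["actual"], subset["predicted"])
        print(f"{group}: accuracy = {acc:.2f}")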

Is it Better than a Human?

Do not oversell the model's capabilities. E.g., a model can predict cancer from a mammogram, but a doctor should always be there to verify the result. You never know when the model will need to be retrained because of new features or cases that were not covered by the training sample.

Model used for Unintended Purposes

Users might end up applying your model to purposes you never intended, some of which may be unethical or harmful.

Keeping Data Confidential

While data science has the potential to help businesses and humanity in general, there are many confidentiality issues because personal data can be collected. Analysis of such data should be carried out in an internet-restricted environment, or, failing that, all sensitive data should be hashed.
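
A minimal sketch of the hashing approach, assuming pandas records with hypothetical identifier columns (``name``, ``nric``) and a salted SHA-256 digest. Note that salted hashing is pseudonymisation rather than full anonymisation: anyone holding the salt can recompute the mapping, so the salt itself must be kept secret.

.. code-block:: python

    import hashlib
    import pandas as pd

    # hypothetical patient records with directly identifying columns
    df = pd.DataFrame({
        "name":      ["Alice Tan", "Bob Lim"],
        "nric":      ["S1234567A", "S7654321B"],
        "diagnosis": ["anxiety", "depression"],
    })

    SALT = "replace-with-a-secret-salt"  # keep this out of version control

    def pseudonymise(value: str) -> str:
        """Replace an identifier with a salted SHA-256 digest."""
        return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()

    for col in ["name", "nric"]:
        df[col] = df[col].map(pseudonymise)

    print(df)  # identifiers are now opaque digests; the diagnosis column stays usable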

Then the question arises again: should we even collect personal data in the first place? Say it is mandatory to collect data from patients with mental illnesses; we should probably avoid collecting their identities, since doing so could deter patients from seeking help at all. Giving them high assurance of confidentiality may do more to keep their illness from escalating than any analysis of their data, and that outcome is more important. After all, helping people is the purpose of the data analysis in the first place, right?