diff --git a/os/softwarefairness.html b/os/softwarefairness.html
new file mode 100644
index 0000000..51b1192
--- /dev/null
+++ b/os/softwarefairness.html
@@ -0,0 +1,242 @@
+SOLAR UCL - Software Fairness
About
This paper provides a comprehensive survey of bias mitigation methods for achieving fairness in Machine + Learning (ML) models. We collect a total of 341 publications concerning bias mitigation for ML classifiers. + These methods can be distinguished based on their intervention procedure (i.e., pre-processing, in-processing, + post-processing) and the technique they apply. We investigate how existing bias mitigation methods are + evaluated in the literature. In particular, we consider datasets, metrics and benchmarking. Based on the + gathered insights (e.g., What is the most popular fairness metric? How many datasets are used for evaluating + bias mitigation methods?), we hope to support practitioners in making informed choices when developing + and evaluating new bias mitigation methods.
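The survey groups bias mitigation methods by intervention point (pre-, in-, post-processing) and examines the fairness metrics used to evaluate them. As a purely illustrative sketch (not taken from the paper; the function name and sample data are our own), statistical parity difference is one commonly reported group fairness metric — the gap in positive-prediction rates between two demographic groups:

```python
def statistical_parity_difference(y_pred, protected):
    """P(y_hat = 1 | protected = 0) - P(y_hat = 1 | protected = 1).

    y_pred: binary predictions (0/1); protected: group membership (0/1).
    A value of 0.0 indicates equal positive-prediction rates across groups.
    """
    group0 = [y for y, a in zip(y_pred, protected) if a == 0]
    group1 = [y for y, a in zip(y_pred, protected) if a == 1]
    return sum(group0) / len(group0) - sum(group1) / len(group1)

# Equal positive rates in both groups -> no disparity:
preds = [1, 0, 1, 0]
groups = [0, 0, 1, 1]
statistical_parity_difference(preds, groups)  # -> 0.0
```

Pre-processing methods try to reduce such disparities by transforming the training data, in-processing methods by constraining the learner, and post-processing methods by adjusting the predictions themselves.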
Resources
Preprint
Authors
Max Hort
Zhenpeng Chen
Jie Zhang

+ +
+ +
+ +
+ + +
+ + +

Federica Sarro

+ +
+ +
+ +
+ + +
+ +

Mark Harman

+ +
+
+
+
+
+

Acknowledgement

+
+ + + + + + + +
+ +

EPIC
\ No newline at end of file