In my MBA class, I learned a quantitative technique called Discriminant Analysis. Discriminant Analysis is useful for making predictions, e.g. whether a person will buy the latest Harry Potter book or consume organic food.
In my assignment, I devised a model that predicted whether a student would be a high scorer in examinations. The prediction was based on answers to a number of questions, such as:
- How much time does the student spend on homework?
- How much time does he/she spend on sports?
- How much time does he/she spend surfing the Web?
- How many siblings does he/she have?
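To make the idea concrete, here is a minimal sketch of how such a two-class discriminant model could work, using Fisher's linear discriminant. All the numbers, the student profiles, and the `predict` helper are invented for illustration; a real analysis would use actual survey data and a statistics package.

```python
import numpy as np

# Hypothetical training data: each row is a student described by
# [homework hours/week, sports hours/week, web-surfing hours/week, siblings].
# Labels: 1 = high scorer, 0 = not. All values are made up for illustration.
X = np.array([
    [12, 3, 5, 1],
    [10, 4, 6, 2],
    [14, 2, 3, 0],
    [11, 5, 4, 3],
    [3, 8, 15, 2],
    [4, 7, 12, 1],
    [2, 9, 14, 4],
    [5, 6, 11, 2],
], dtype=float)
y = np.array([1, 1, 1, 1, 0, 0, 0, 0])

mu0 = X[y == 0].mean(axis=0)  # mean profile of the low-scorer group
mu1 = X[y == 1].mean(axis=0)  # mean profile of the high-scorer group

def scatter(Xc):
    """Within-class scatter: sum of outer products of centered rows."""
    d = Xc - Xc.mean(axis=0)
    return d.T @ d

# Fisher's discriminant direction: separates the class means as much as
# possible relative to within-class spread. A tiny ridge term keeps the
# scatter matrix invertible on this small sample.
Sw = scatter(X[y == 0]) + scatter(X[y == 1])
w = np.linalg.solve(Sw + 1e-6 * np.eye(X.shape[1]), mu1 - mu0)

# Classify by projecting onto w and thresholding at the midpoint of the
# projected class means.
threshold = w @ (mu0 + mu1) / 2

def predict(student):
    """Return 1 if the model predicts a high scorer, else 0."""
    return int(w @ np.asarray(student, dtype=float) > threshold)
```

With this toy data, a student who does a lot of homework and little web surfing, e.g. `predict([13, 3, 4, 1])`, falls on the high-scorer side of the boundary. The concern raised below applies directly: the model happily scores any profile it is given, whether or not the underlying relationship is causal.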
My classmate devised a model to predict whether a person would be a potential terrorist. This raises a concern: if a person is predicted to be a terrorist, will he be subjected to unnecessary surveillance? Will computerized forecasting lead to discrimination?
In her article "Forecasting human behaviour carries big risks", Christine Evans-Pughe writes:
The Surveillance Society report from the Information Commissioner’s Office outlined worries about predictive social sorting on the grounds that could amount to discrimination, create new underclasses and that by totting up of negative indicators from health, school and other records, a predictive model could make its own worst predictions come true. “For instance, if your parents both have criminal records or you have a bad school attendance record because of poor health, even if you are the best-behaved kid in class, you will find that every teacher is likely to treat you with suspicion."
In conclusion, Evans-Pughe sums up:
Computerized forecasting techniques are certainly useful for stores, but flawed when it comes to complex human issues.