Ethical Issues
- Are users' privacy expectations met by your software?
- Could the use of your software result in racial, gender, religious, or any other type of discrimination? How does your software try to mitigate this problem?
- Can your software be abused by some users to cause harm to other users, or to the public at large? How do you mitigate this?
Our software respects user privacy by securely storing all sensitive data, such as profile information, availability, and communication logs, in a protected database. Our matching algorithm focuses exclusively on objective factors like study habits, relevant coursework, and availability, which directly relate to academic collaboration. By excluding personal attributes such as race, gender, or religious beliefs, the algorithm minimizes any risk of discrimination. This approach ensures that all users have an equal opportunity to connect with others based on shared study interests and needs.
Additionally, we will design the user interface to avoid categorizing or sorting users by demographic criteria, reinforcing a neutral and inclusive user experience that promotes diversity within study groups. The matching algorithm will be wholly based on attributes such as schedule availability and listed study habits.
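The demographic-blind matching described above can be sketched in code. This is a minimal illustration, not the project's actual implementation: the `StudentProfile` fields, weights, and `match_score` function are all hypothetical, chosen to show that the score is computed only from coursework, study habits, and schedule overlap, with no demographic fields present in the profile at all.

```python
from dataclasses import dataclass, field

@dataclass
class StudentProfile:
    # Only academically relevant attributes exist on the profile;
    # there are no demographic fields (race, gender, religion) to match on.
    user_id: str
    courses: set = field(default_factory=set)       # e.g. {"CS101", "MATH220"}
    study_habits: set = field(default_factory=set)  # e.g. {"quiet", "evening"}
    availability: set = field(default_factory=set)  # e.g. {"Mon-18", "Wed-20"}

def match_score(a: StudentProfile, b: StudentProfile) -> float:
    """Score a pair purely on shared courses, habits, and free time slots."""
    shared_courses = len(a.courses & b.courses)
    shared_habits = len(a.study_habits & b.study_habits)
    shared_slots = len(a.availability & b.availability)
    # A match is only useful if the students can actually meet.
    if shared_slots == 0:
        return 0.0
    # Illustrative weights: coursework overlap matters most.
    return 3.0 * shared_courses + 1.0 * shared_habits + 0.5 * shared_slots
```

Because the score depends only on these objective inputs, two users with identical coursework, habits, and availability always receive the same score regardless of who they are.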
The chat feature provides valuable communication within study groups but introduces potential for misuse. To mitigate this, we offer users the ability to block others, which empowers them to control their interactions and avoid unwanted communication. These measures aim to create a safe and respectful environment, helping to minimize the risk of harm while maintaining open communication within the platform.
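The blocking behavior can be sketched as a check performed before any message is delivered. Again, this is a hypothetical illustration of the design, not the project's code: the `ChatService` class and its method names are assumptions made for the example.

```python
class ChatService:
    """Minimal sketch of block-aware message delivery (hypothetical API)."""

    def __init__(self):
        self._blocked: dict[str, set[str]] = {}  # user -> users they have blocked
        self.delivered: list[tuple[str, str, str]] = []

    def block(self, user: str, other: str) -> None:
        self._blocked.setdefault(user, set()).add(other)

    def unblock(self, user: str, other: str) -> None:
        self._blocked.get(user, set()).discard(other)

    def send(self, sender: str, recipient: str, text: str) -> bool:
        # Silently drop the message if the recipient has blocked the sender,
        # so the blocked user receives no signal about the block.
        if sender in self._blocked.get(recipient, set()):
            return False
        self.delivered.append((sender, recipient, text))
        return True
```

Enforcing the block at delivery time, rather than in the client, means a blocked user cannot bypass it, and unblocking restores normal communication.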