Executive Summary For Software Engineers

Writing software is hard, and it only becomes harder when you have to think about misuse and abuse. There are a few basic principles that can help you write software that is less likely to cause harm - to your systems, your users, and the world.

  1. Nothing should ever be unlimited - When defining any functionality, ask your PMs "what is the most a single user could legitimately use this feature?", and then enforce that answer via a hard limit. Limits keep your systems running, and alert you when users are misusing the product at scale.
  2. Don't Treat Users Equally - Users are not all the same, and they don't all deserve the same amount of trust. Build Graduated Features into every piece of powerful functionality.
  3. No Robots - Proactive action is necessary if you want to prevent automated access. Hope is not a strategy, nor is authentication (alone). Cryptographic signatures, non-guessable ids, rate limiting, and dynamic quotas are all helpful flavors of the same idea: humans and computers tend to access information differently, and you can use those distinctions to your advantage.
  4. Ask "should we" - The biggest abuse headaches arise from features that are rushed out the door. Find ways of inserting engineering mindsets into the product development process, and get buy-in that things should not launch until there is a plan for abuse. When you get your organization to think this way, omission - not shipping the feature at all - becomes a potent option.
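Principle 1's hard limits can be sketched with a simple per-user daily counter. This is a minimal in-memory illustration, not a production rate limiter; the feature name, the limit of 50, and the `DailyQuota` class are all invented for the example.

```python
import time
from collections import defaultdict

# Hypothetical hard limit agreed with the PM: at most 50 exports
# per user per day. A real system would persist this in a shared store.
MAX_EXPORTS_PER_DAY = 50

class DailyQuota:
    def __init__(self, limit):
        self.limit = limit
        self.counts = defaultdict(int)
        self.day = self._today()

    def _today(self):
        return int(time.time() // 86400)

    def allow(self, user_id):
        # Reset all counters at the day boundary.
        if self._today() != self.day:
            self.day = self._today()
            self.counts.clear()
        if self.counts[user_id] >= self.limit:
            # Over-limit events are a signal, not just an error:
            # log them, because this is where abuse at scale shows up.
            return False
        self.counts[user_id] += 1
        return True

quota = DailyQuota(MAX_EXPORTS_PER_DAY)
print(all(quota.allow("alice") for _ in range(50)))  # within the limit: True
print(quota.allow("alice"))                          # request 51 is refused: False
```

Note that the refusal path doubles as the alerting path: the same check that protects the system tells you which users to look at.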
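Principle 2's Graduated Features amount to gating capabilities on a trust tier rather than on login alone. A minimal sketch, with invented tier names, thresholds, and feature names:

```python
from enum import IntEnum

# Illustrative trust tiers; a real system would derive these from
# account age, verification, payment history, abuse flags, etc.
class Trust(IntEnum):
    NEW = 0          # just signed up
    ESTABLISHED = 1  # verified, some history
    TRUSTED = 2      # long history, no abuse flags

# More powerful features require more trust.
FEATURE_GATES = {
    "send_message": Trust.NEW,
    "bulk_invite": Trust.ESTABLISHED,
    "api_access": Trust.TRUSTED,
}

def can_use(user_trust, feature):
    # A user unlocks a feature once their tier meets its gate.
    return user_trust >= FEATURE_GATES[feature]

print(can_use(Trust.NEW, "bulk_invite"))      # False: too new for bulk actions
print(can_use(Trust.TRUSTED, "bulk_invite"))  # True
```

The design point is that the gate table is data, so product and abuse teams can tighten or loosen access without code changes to each feature.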
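Two of the anti-automation tools named in principle 3, non-guessable ids and cryptographic signatures, can be sketched with the Python standard library alone. The function names and the one-hour expiry are assumptions for the example:

```python
import hashlib
import hmac
import secrets
import time

SECRET = secrets.token_bytes(32)  # server-side signing key (illustrative)

def new_resource_id():
    # 128 bits of randomness: enumerating resources by guessing
    # sequential ids becomes infeasible.
    return secrets.token_urlsafe(16)

def sign(resource_id, expires_at):
    # Sign the id together with an expiry so links cannot be
    # minted or extended by the client.
    msg = f"{resource_id}:{expires_at}".encode()
    return hmac.new(SECRET, msg, hashlib.sha256).hexdigest()

def verify(resource_id, expires_at, signature):
    if time.time() > expires_at:
        return False  # expired links stop long-lived scrapers
    expected = sign(resource_id, expires_at)
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(expected, signature)

rid = new_resource_id()
exp = int(time.time()) + 3600  # valid for one hour
sig = sign(rid, exp)
print(verify(rid, exp, sig))       # True
print(verify(rid, exp, "forged"))  # False
```

Each mechanism is cheap on its own; the point of the "same idea" framing above is that layering several of them makes automated access expensive while leaving human access untouched.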

More generally, become familiar with the menu of options you have for changing product design, and get the buy-in of leadership and PM to prioritize the long-term health of the system and its community over short-term objectives.

This falls to you because at the end of the day, SWEs bear the internal costs of abuse. We're asked to do more with less, to drop everything to firefight the current crisis, and are often encouraged to reach for content moderation to douse the problem.

SWEs are familiar with this mindset: it is the same short-term thinking and pressure that accretes technical debt. The same solution applies: long-term investments in reversing bad choices, more thorough design processes, and engineering autonomy to push back against short-term thinking. All of these make technical systems more robust, and each can make the product more robust against abuse as well.