GoodBot is an interdisciplinary community of professionals in tech, law, design, social sciences, and policy who share a commitment to ensuring that digital and AI systems are fair, trustworthy, transparent and accountable. We connect, amplify and strengthen a resilient Canadian responsible technology ecosystem through research, policy and practice in response to emerging societal impacts and governance challenges.
We believe that an unaccountable tech ecosystem that enables systemic harm requires a systems approach. We advance proactive digital and AI policy and re-imagine healthy, inclusive and socially sustainable technology futures where communities and democratic societies can thrive.
We help small- and medium-sized nonprofits and technology companies navigate a rapidly evolving AI and regulatory landscape by developing and embedding measurable social impact strategies into products, business models and governance.
our vision
A sustainable world where AI is rigorously governed and equitably enables the best of humanity.
our mission
To advance proactive digital policy and inclusive technology futures so communities and democratic societies can thrive.
about the name goodbot
The name goodbot emerged during a conversation with a corporate partner exploring high rates of racist hate speech, fuelled by bots, in the comments sections of online newspapers. An executive suggested mobilizing his company’s army of ‘good’ bots to counter the ‘bad’ bots, an anecdote that highlights several important and common trends. These include the tech sector’s:
instinct to “tech” its way out of problems;
lack of intentionality in building and scaling tech with proactive consideration for societal impacts in products and business models;
lack of thoughtful policy and governance capable of advancing public interest;
acceptance of oversimplified framings of ‘good’ vs ‘bad’ tech;
tendency to respond to symptoms rather than systems-level issues; and
adherence to incentive systems that often discourage responsible responses.
The name goodbot is therefore something of an ironic oversimplification of an incredibly complex and interconnected set of issues that need to be addressed simultaneously, given the rapid recent advances in technology. While most will agree that racism is bad, solving it is less straightforward. Is the correct answer to bar all speech that is considered offensive? How do you define and detect this speech? Is there a risk that such practices could inadvertently suppress the speech of marginalized groups and reinforce echo chambers? What does that mean for values that we consider sacred but that others might consider offensive? On platforms that optimize for engagement and profit, how do we reorient technology to optimize for society? Is that even possible? If so, what kind of institutions and frameworks do we need to get there?
These are but some of the complex questions that goodbot seeks to explore in building toward trusted, accountable, and socially sustainable technology ecosystems.