In December we hosted a session as part of Tech Week 2020 that focused on “The Importance of Diversity in Tech“. If you missed it, don’t worry: we’ve summarized some of the key takeaway points from the session below, and you can watch the full recording of this and all the other sessions here.

Diversity in Tech – Challenges Faced

Despite an increased focus on diversity, the tech sector still has a lot of work to do. The percentage of women in tech in the UK increased by a few percentage points in 2020 to 20%, a record high but still very low. Black women in technology are particularly under-represented (only 0.7%).

The Importance of Diversity from a Risk Perspective

Innovation is best served by a diverse team. Tech companies must innovate consistently to stay ahead of the market and attract the best talent with the right culture. Improving diversity is the right thing to do, but it is also a business imperative. Research undertaken by Accenture pre-COVID-19 found that the most equal cultures are 6 times more innovative than the least equal ones; the most equal and diverse cultures are 11 times more innovative. This encompasses diversity in terms of gender, age, ability, sexual orientation, religion and more. Problems emerge from a lack of diversity: for instance, an AI facial recognition tool developed using a sample covering only certain skin tones is likely to produce undesirable results for people with other skin tones.

Legal Risks and Regulation

There are a range of legal risks to consider when it comes to the development of new technology, particularly in relation to the use of data in technology products and services. For example, if unrepresentative data is used to develop an AI tool then this may lead to biased or discriminatory outcomes. In terms of existing legal frameworks, there is not currently much in the way of AI-specific legislation around the world, although this continues to be discussed in a number of countries (we discussed the EU’s AI white paper earlier this year). There is a lot of regulatory focus on AI, and a range of AI materials have been published by organisations around the world. In the UK we have a range of regulators and other bodies considering the impact of AI (including the ICO, FCA, the Office for AI, the AI Council and the UK Regulators Network). In 2020 the ICO published AI guidance and an AI auditing framework.

Importance of Lawyers

When it comes to new technology, all involved parties need to pay attention to the risk of unintended consequences. For example, when developing AI solutions there is a risk that individuals could be adversely affected by the outputs of a poorly designed algorithm. Tech is designed by people and, despite media references to “rogue algorithms”, systems generally do what they are designed to do. Lawyers have an essential role to play in helping to ensure that relevant risks are identified and mitigated. They also bring their own, different, perspective and can add diversity to the team. Lawyers working with tech teams should be asking a range of questions:

  • What does the product do?
  • What are the consequences if the product works / doesn’t work?
  • Are the data practices being used responsible and ethical?
  • How secure is the data?
  • Could the product be problematic – who could it impact and how?
  • How diverse is the team designing the tech? If it’s not diverse, how do we deal with any blind spots?

Responsibility

Identifying which issues could arise, and who should be responsible for them, should form part of any responsible tech and ethics toolkit. If the AI is self-learning and could learn to make decisions that lead to bad outcomes, it will need to be monitored, corrected and retrained when problems are identified. In any case, checking for potential risks and issues should be part of any testing regime.

Mitigation of Risk in AI

It is important to understand, and to be as transparent as possible about, how the technology’s decisions could impact end users (or other interested parties). Any organisation looking to develop or deploy AI should carry out appropriate risk identification and mitigation at each stage of the project life cycle. Mitigations may be technical, operational or contractual.

Getting help from the outside

Tech is changing rapidly and it may be difficult for some organisations to keep up to date and find appropriate skills in-house. Some companies may need to bring in external resources or work with third parties to develop certain products or to ensure they have the right capabilities. A focus on diversity and responsible and ethical tech development will be very important when partnering with third parties.

It’s also the case that small organisations may find it difficult to find a diverse set of stakeholders in-house. In those circumstances, they may consider introducing representatives from outside bodies to help ensure that they have access to a diverse group of views and experience when building their products and services.

Ongoing Compliance

In terms of practical compliance steps, companies can help ensure that they develop tech in a responsible manner by conducting risk assessments, algorithmic assessments and data processing assessments, both pre-deployment and on a regular basis post-deployment. Frequent assessments allow for a dynamic review of the data; they should be multi-faceted and multi-layered, looking at all relevant risks and including all relevant stakeholders. Ultimately, there must always be meaningful human oversight of technology, whether or not personal data is involved in the product or service.

How can lawyers stay up to date with tech law and developments?

Some suggestions from the panel:

  • Check out updates from SCL at https://www.scl.org/
  • Read Baker McKenzie’s updates.
  • Follow what the ICO and the other regulators are doing.
  • Follow @computersandlaw @altrishaw @SarahBurnett @sumolaw on Twitter!

Thank you to our wonderful guest speakers for joining us:

Sarah Burnett – Founding Partner & Head of Technology Immersion and Market Insights at Emergence

Meera Doshi – Legal Counsel at ThoughtWorks

Rory O’Keefe – Director of Legal Services UK at Accenture

Patricia Shaw – CEO and Founder of Beyond Reach & SCL Trustee

Author

Sue is a Partner in our Technology practice in London. Sue specialises in major technology deals including cloud, outsourcing, digital transformation and development and licensing. She also advises on a range of legal and regulatory issues relating to the development and roll-out of new technologies including AI, blockchain/DLT, metaverse and crypto-assets.