Centre for AI, Culture and Society (CAICS)

About us

The Centre for AI, Culture and Society (CAICS) develops ethical and trustworthy intelligent software solutions for business, organisations and society. We test, validate and verify AI systems to ensure that they are fit for purpose, and we help organisations use and interpret their data wisely and ethically.

Our work involves championing diversity, inclusion and sustainability through good AI practices. In our experience, even simple improvements in data analysis can lead to substantial gains in profitability. We have a large team of subject specialists who understand your sector and can provide expert advice on how to gain maximum value from your data.

We analyse data sets and systems for bias, interpretability, brittleness and robustness. We carry out risk analysis and classification to understand which restrictions and controls should be applied. We develop bespoke AI solutions for organisations.
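
As a purely hypothetical illustration of the kind of dataset bias check described above (not the Centre's own methodology), the sketch below computes a simple demographic parity gap between two invented groups; all column names and figures are assumptions made up for the example.

    import pandas as pd

    # Invented example data: decisions recorded against a protected attribute.
    df = pd.DataFrame({
        "group":    ["A", "A", "A", "B", "B", "B", "B", "A"],
        "approved": [1,   0,   1,   0,   0,   1,   0,   1],
    })

    # Approval rate per group.
    rates = df.groupby("group")["approved"].mean()

    # Demographic parity difference: gap between highest and lowest approval rates.
    parity_gap = rates.max() - rates.min()

    print(rates)
    print(f"Demographic parity difference: {parity_gap:.2f}")

A large gap would flag the data set or system for closer review; in practice such checks sit alongside the robustness and interpretability analyses mentioned above.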

Research impact

Our research and impact strategy underpins the Centre's strategic commitment to international, world-leading research and development of ethical and trustworthy intelligent software solutions for business, organisations and society.

We pursue and support ethical AI research of the highest quality that is, or has the potential to become, recognised as internationally excellent or world-leading.

The Centre for AI, Culture and Society understands the challenges that different businesses and organisations can face when adopting AI. We offer consultancy with access to a broad base of talent, from specialists in cutting-edge technology to experts in social science and in the integration of technology into business processes.

We recognise that our staff are key to our success, and we will continue to invest in research and to support and develop opportunities for research and knowledge exchange through industrial, commercial and public sector partnerships.

Leadership

Professor Nigel Crook
Associate Dean: Research and Knowledge Exchange (ADRKE)

Esra Karahasanoglu
Assistant Director, Software Engineering

Kevin Maynard
Co-Director at Institute for Ethical AI

Arijit Mitra
Head of Innovation, Institute for Ethical Artificial Intelligence

Dr Selin Nugent
Assistant Director - Centre for AI, Culture and Society

Membership

Staff

Name | Role | Email
Chara Bakalis | Deputy Head for Strategy and Development, Reader in Law | cbakalis@brookes.ac.uk
Dr Fabrizio Bonatesta | Reader in Thermofluids | fbonatesta@brookes.ac.uk
Dr Tjeerd Olde Scheper | Reader in Computer Science | tvolde-scheper@brookes.ac.uk
Dr Matthias Rolf | Reader in Computer Science | mrolf@brookes.ac.uk

Projects

Active projects

MoFHS

A software application developed by the Centre to help companies reduce the costs of integrating IT systems and to facilitate digital transformation by taking the complexity out of these tasks and reducing risk.
Funder: Innovate UK

RESISTIRÉ (Responding to outbreaks through co-creative inclusive equality strategies)

Oxford Brookes University is coordinating the delivery of the app and data visualisations. The project is being implemented from an equality perspective and will develop policies that help to understand and monitor social change. The consortium is focusing on how the impact of Covid-19 risks widening inequalities related to race, education level and gender.
Funder: Horizon 2020

AIDA Incubator - Research Incubator

The AIDA (Artificial Intelligence and Data Analysis) Incubator at Oxford Brookes University is a world-class centre focused on helping companies in the professional services sector adopt advanced quantitative techniques, including AI and data analysis. This work is funded by Research England. The ambition is to help these companies maximise their profitability and improve their productivity through the use of technologies in areas such as law, HR, design and architecture, whilst minimising the potential for adverse impacts.
Funder: RED Fund