Amsterdam-based BrainCreators wants to explain neural network models with new partnership

Dutch software company BrainCreators has announced a partnership with the Dublin-based Insight SFI Research Center for Data Analytics. Over the next year, the partnership aims to make progress in finding structure in neural network models. BrainCreators specialises in AI-based intelligent digital inspectors and now wants to make neural network models easy for human users to understand.

The software company says that understanding these neural models matters more than ever, since their choices can “change lives”. Neural network models form the backbone of a number of decisions made in fields such as law and medicine. As a result, these models must not only be correct but correct for the right reasons. The partnership between BrainCreators and the Insight SFI Research Center for Data Analytics aims to make these life-changing models explainable to humans.

Making neural network models explainable

The need to make neural network models explainable stems from the fact that many AI applications are currently based on ‘black box models’, which tend to achieve better results at the expense of explainability, making an application’s decisions difficult to explain. With black box models, companies can automate difficult tasks, but the choices the AI model makes become hard for human users to understand.

There is an urgent need for scientific progress in this area, which is both technically challenging and socially important: these models can inadvertently lead to discrimination or a lack of trust. Neural network models are typical black boxes, and they are the models most commonly used in today’s deep learning systems. BrainCreators sees a potential problem here and is thus looking to find structure in neural network models within one year.

Maarten Stol, Chief Scientist at BrainCreators, explains that understanding the internal structure of a trained neural network is currently not possible. A neural network consists of many connections, and each connection makes a small statistical contribution to the decisions the model makes. When the model makes a mistake, it is difficult to identify which connection was responsible. Stol sees a fix in organising the visual objects and concepts in a way that is understandable to humans, which would let end users better judge whether a model is correct for the right reasons.
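Stol’s point can be illustrated with a toy example. The sketch below is hypothetical (it is not BrainCreators’ code): even in a tiny fully connected layer, a decision is spread across many weighted connections, so zeroing out any single connection shifts the output only slightly, and no one connection “owns” a mistake.

```python
import math

# A toy fully connected layer: 3 inputs -> 2 outputs.
# Every weight is one "connection"; each nudges the output
# only a little, so no single weight explains a decision.
weights = [
    [0.2, -0.5, 0.1],   # connections feeding output unit 0
    [0.4, 0.3, -0.2],   # connections feeding output unit 1
]

def forward(x):
    """Weighted sums followed by a sigmoid activation."""
    return [1 / (1 + math.exp(-sum(w * xi for w, xi in zip(row, x))))
            for row in weights]

x = [1.0, 2.0, 3.0]
baseline = forward(x)

# Ablate one connection at a time and measure how far the
# output moves: each individual contribution stays small.
for i, row in enumerate(weights):
    for j in range(len(row)):
        saved, row[j] = row[j], 0.0
        perturbed = forward(x)
        delta = abs(perturbed[i] - baseline[i])
        row[j] = saved  # restore the connection
        print(f"zeroing w[{i}][{j}] shifts output {i} by {delta:.3f}")
```

In a real deep network the same ablation logic runs over millions of weights, and the per-connection deltas become vanishingly small, which is exactly why blame cannot be pinned on one connection.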

“Scientific progress in the field of neural networks is important, because the choices that AI applications make increasingly have an impact on society and people’s lives. We have therefore decided to collaborate with the Insight SFI Research Center for Data Analytics from Dublin and share our knowledge and experience,” Stol said. “In 2022, we want to work together with the researchers from the Insight SFI Research Center for Data Analytics towards a scientific publication that contributes to a better application of neural networks. The goal is to then also use these findings for our intelligent digital inspectors,” he adds.

“Thanks to the collaboration with BrainCreators, we can better attune our research to practice. All of our interactions and scientific discussions with BrainCreators are therefore very productive, as they often come up with relevant feedback right away,” Viktor Horta, PhD student at the Insight SFI Research Center for Data Analytics, says in a statement. “Collaborations like the one with BrainCreators are valuable and they don’t happen very often. This collaboration is essential to tailor our methods to the practical needs of the field. In this way, we can transfer our results to real-world applications.”

While AI is the ability of a computer system to simulate human intelligence, it still cannot do so without bias. AI bias has become a real concern for researchers and technologists, and it has been found to stem from human cognitive bias. The partnership between BrainCreators and the Insight SFI Research Center for Data Analytics will not only help limit this bias but, with understandable models, also help human users see where it arises.

BrainCreators: What you need to know

BrainCreators is a software company founded by three entrepreneurs who met in the 1990s through Artificial Intelligence training at the University of Amsterdam. With its intelligent automation, BrainCreators automates repetitive tasks, leading to improved quality of life. The BrainMatter platform allows organisations to quickly develop digital inspectors, which can be used directly as a digital service in the physical world.
