Black box AI is a term used in the fields of artificial intelligence, digital transformation and big data. It describes computer programmes or algorithms that, like a "black box", make decisions whose inner workings cannot be fully traced by humans. Although such AI systems often deliver impressive results, how they arrived at a particular result usually remains hidden.
One example: an HR department uses AI software to select suitable applicants from hundreds of CVs. The system suggests a handful of candidates, but no one can explain exactly why these people were chosen and others were not. The AI is therefore considered a "black box", because neither users nor developers can transparently trace its decision-making process.
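The sketch below illustrates this situation in Python. It is a minimal, hypothetical example, not a real screening product: the applicant features, the training labels and the use of a scikit-learn neural network are all assumptions chosen to show that such a model returns only a decision and a score, with no human-readable rationale attached.

```python
# Minimal sketch of a "black box" screening model (hypothetical data).
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Hypothetical applicant features, e.g. years of experience, skills score,
# test score. In a real system these would be extracted from CVs.
X_train = rng.random((200, 3))
# Hypothetical past hiring decisions used as training labels.
y_train = (rng.random(200) > 0.5).astype(int)

# A multi-layer neural network: its learned weights are not interpretable.
model = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X_train, y_train)

# New applicants receive a decision and a probability, but the model offers
# no explanation of why one applicant ranks above another.
new_applicants = rng.random((5, 3))
decisions = model.predict(new_applicants)
scores = model.predict_proba(new_applicants)[:, 1]

for i, (d, s) in enumerate(zip(decisions, scores)):
    print(f"Applicant {i}: shortlisted={bool(d)} (score {s:.2f}) - no rationale available")
```

The point of the sketch is simply that the output contains no reasoning: anyone wanting to know why applicant 3 was shortlisted would have to inspect thousands of learned weights, which is exactly the opacity the term "black box" describes.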
This lack of transparency becomes problematic when important decisions are at stake, for example in finance or personnel matters. Companies and decision-makers should therefore weigh the advantages and disadvantages of black box AI and insist on transparency in order to build trust and minimise risks.