Bullying will be easier to detect with artificial intelligence, thanks to the technology's closeness to minors.
WatsomApp is a startup that is collaborating with IBM on the development of an artificial intelligence robot that can detect cases of bullying among children and young people anywhere in the world.
According to UNICEF, bullying prevents about 150 million young people between the ages of 13 and 15 from learning, so it is a problem that needs to be tackled closely to stop this outcome.
The tool can detect, through conversations, whether a minor is suffering physically or psychologically at school, something that parents and teachers can take as long as nine months to discover.
“The longer bullying goes on, the more likely it is to cause long-term damage to the victim’s mental health. To solve the problem, WatsomApp wanted to help teachers identify and address bullying more quickly,” said Gemma Gutierrez, user experience consultant for the company responsible for the AI.
IBM Cloud supported the startup’s work in developing a series of online games for children in Spain. Through these games, minors’ behavior while solving the challenges was analyzed in order to identify certain characteristics.
The company collects this data to identify the personality profile that best fits each child and thus develop the system.
“We realized that we could use analytics to identify the personality of each student in the classroom. By providing this knowledge to teachers and school psychologists, we knew we could help them intervene to stop bullying in their classrooms before it was too late.”
Within the games, the system collects data on possible bullying situations for both victims and perpetrators, detailing aspects such as family situation or the way they empathize with their classmates.
“Educators can only address bullying if they know what’s going on. Our platform can bring these issues to their attention in just one month, as opposed to the nine months it usually takes to identify bullying,” Gutierrez said.
In addition, the company says that minors are more open when they talk about these issues with an AI avatar than with parents or teachers, so it plans to take the tool beyond Spain and Peru, where more than 3,000 children and young people have already participated.
Wardiam, meanwhile, is an application that virtually accompanies people so that they know and understand their rights through technology. It guides users through the steps that need to be taken before a dispute and also serves as a resource bank for research by lawyers and law firms.
The platform is available in web and mobile versions and uses artificial intelligence to study a country’s legislation, currently Colombia’s, in order to provide accurate and up-to-date information on rights.
Its operation is very simple: the user asks Wardiam what to do when faced with a problem or describes their current situation, and the tool informs them of the options and actions available in their case.