Eden Saig, a computer science student at the Technion - Israel Institute of Technology, developed the machine learning system, which works by recognising repeated word patterns.
Saig developed the system at the Technion's Learning and Reasoning Laboratory, after taking a course in artificial intelligence supervised by Professor Shaul Markovich, of the Technion Faculty of Computer Science.
According to Saig, tone of voice and inflection play an important role in conveying meaning in verbally communicated messages.
In text and email messages, those nuances are lost and writers who want to signify sarcasm, sympathy or doubt have taken to using images, or "emoticons," such as the smiley face, to compensate.
"These icons are superficial cues at best. They could never express the subtle or complex feelings that exist in real life verbal communication," said Saig.
Recently, pages intended to be humorous have appeared on social networks such as Facebook and Twitter under titles like "superior and condescending people" or "ordinary and sensible people."
Such pages are very popular in Israel, said Saig, and users are invited to submit suggestions for phrases that can be labelled as 'stereotypical sayings' for that particular page.
By observing posts to these pages, Saig identified recurring patterns; the method he developed enables the system to detect those patterns in new posts on any social network.
Since the content on these pages was colloquial, everyday language, Saig realised that "the content could provide a good database for collecting homogeneous data that could, in turn, help 'teach' a computerised learning system to recognise patronising-sounding semantics or slang words and phrases in text."
Saig applied machine-learning algorithms to the content on these pages and used the results to automatically identify stereotypical behaviour found every day in social network communication.
The quantification was carried out by examining 5,000 posts on these social media pages and, through statistical analysis, training a learning system to recognise content structures that could be identified as condescending or as slang.
The system was built to identify keywords and grammatical habits characteristic of the sentence structures associated with each sentiment.
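The article does not say which algorithms Saig used, but a task like this is conventionally framed as supervised text classification: posts are labelled by the page they came from and a model learns which word patterns distinguish the two groups. The sketch below is purely illustrative, assuming scikit-learn, a bag-of-words feature extractor and a Naive Bayes classifier; the example posts, labels and model choice are placeholders, not Saig's actual pipeline.

```python
# Illustrative only (not Saig's actual system): train a text classifier to
# separate "condescending" posts from "ordinary" ones using word patterns.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Hypothetical training data: posts gathered from the two kinds of pages,
# labelled by the page they were submitted to.
posts = [
    "Well, actually, everyone knows that already.",
    "Let me explain this slowly so you can follow.",
    "Had a nice walk and a coffee this morning.",
    "Looking forward to seeing friends this weekend.",
]
labels = ["condescending", "ordinary"] * 2
labels = ["condescending", "condescending", "ordinary", "ordinary"]

# Unigram and bigram counts stand in for the "keywords and grammatical
# habits" described in the article.
model = make_pipeline(
    CountVectorizer(ngram_range=(1, 2)),
    MultinomialNB(),
)
model.fit(posts, labels)
```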
"Now, the system can recognise patterns that are either condescending or caring sentiments and can even send a text message to the user if the system thinks the post may be arrogant," said Saig.
When applied to other social networking pages, it may help detect content that suggests suicidal ideation, for example, or calls for help, or expressions of admiration or pleasure, Saig said.