
Guiding Activities in NeuroAI: From Market Study to Data Collection

  • Writer: Bassem Ben Ghorbel
  • Feb 19, 2025
  • 4 min read

Updated: Mar 3, 2025

In the investigation phase of our NeuroAI project, we've meticulously defined a series of guiding activities to ensure our approach is legally compliant, technically sound, and ethically responsible. In this post, we share how we are studying the market, exploring existing solutions, evaluating cutting-edge AI models, and sourcing high-quality datasets for multi-modal emotion recognition.




Existing Solutions

Current solutions in AI-driven mental health care primarily include chatbots and self-help applications. However, these technologies see limited adoption in clinical settings and are used mainly in research environments rather than in real-world therapy. The challenge remains bridging the gap between experimental AI models and their practical integration into mental health practice.


Market Study

Our research indicates a growing demand for accessible mental health care, driven by:

  • The rise of telepathy-style AI applications (e.g., brain-to-text decoding) and brain-machine interfaces.

  • Advancements in neuropsychology and cognitive AI.

  • The increasing global need for mental health support due to stress, anxiety, and lifestyle changes.

These trends highlight the necessity of a robust AI-powered system that integrates multiple physiological and behavioral signals to provide reliable emotional insights.


Legal Constraints


Anonymization & Ethical Use

  • Legal Basis: Organic Law No. 2004-63 (July 27, 2004) on the Protection of Personal Data

  • Key Points:

    • All biometric or personal data must be anonymized to protect individual identities (a minimal pseudonymization sketch follows this list).

    • Any personally identifiable information (PII) is handled in strict accordance with data protection regulations.

    • Transparency is critical—our process clearly states how data is collected, used, and stored.
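
As a concrete illustration of the anonymization step, here is a minimal Python sketch (the record fields and the salt variable are hypothetical, not our actual schema) showing how participant identifiers could be replaced with salted, irreversible pseudonyms before any signal data is stored:

```python
import hashlib
import os

# Secret salt kept outside the dataset (e.g., an environment variable),
# so pseudonyms cannot be reversed by anyone holding only the data files.
SALT = os.environ.get("NEUROAI_ID_SALT", "change-me")

def pseudonymize(participant_id: str) -> str:
    """Replace a real participant ID with an irreversible pseudonym."""
    return hashlib.sha256((SALT + participant_id).encode("utf-8")).hexdigest()[:12]

def anonymize_record(record: dict) -> dict:
    """Strip direct identifiers and keep only fields needed for modeling."""
    return {
        "subject": pseudonymize(record["national_id"]),  # hypothetical PII field
        "age_range": f"{(record['age'] // 10) * 10}s",    # coarsen exact age
        "eeg_file": record["eeg_file"],                   # signal data, no PII
    }

if __name__ == "__main__":
    raw = {"national_id": "12345678", "age": 27, "eeg_file": "s01_session1.edf"}
    print(anonymize_record(raw))  # e.g., {'subject': '…', 'age_range': '20s', ...}
```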

Website Terms of Use & Licenses

  • Legal Basis: Law No. 94-36 (February 24, 1994) on Literary and Artistic Property

  • Key Points:

    • Data extracted from websites must adhere to the site’s terms of service (a simple robots.txt check is sketched after this list).

    • If data is copyrighted or restricted, explicit permission or licensing is required.

    • Unauthorized data scraping can lead to legal consequences under intellectual property laws.
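
To make this constraint operational, a minimal precaution is to check a site's robots.txt before fetching any page, in addition to reading its terms of service and licensing. The sketch below uses Python's standard urllib.robotparser; the URL is a placeholder, not an actual data source:

```python
from urllib.robotparser import RobotFileParser

def allowed_to_fetch(base_url: str, page_url: str,
                     user_agent: str = "NeuroAI-bot") -> bool:
    """Return True only if the site's robots.txt permits fetching page_url."""
    parser = RobotFileParser()
    parser.set_url(base_url.rstrip("/") + "/robots.txt")
    parser.read()  # downloads and parses robots.txt
    return parser.can_fetch(user_agent, page_url)

if __name__ == "__main__":
    # Placeholder URL for illustration only; real use still requires checking
    # the site's terms of service and any licensing restrictions.
    print(allowed_to_fetch("https://example.com", "https://example.com/data/page1.html"))
```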

Compliance with Legal Frameworks

  • Our work ensures strict compliance with Tunisia's copyright, cybersecurity, and data protection laws when using online data sources.

AI & Ethical Guidelines

  • Transparency: Clearly explain our AI’s decision-making process in detecting emotions.

  • Fairness & Bias Mitigation: Strive to prevent discrimination based on gender, ethnicity, or other protected attributes (a simple per-group accuracy audit is sketched below).

  • Accountability: Define responsibilities for any errors, biases, or potential harm caused by the AI system.
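
One concrete way to act on the fairness point is to monitor model performance per demographic group. The sketch below computes per-group accuracy on toy, made-up labels (not real study data); a large gap between groups would flag potential bias:

```python
from collections import defaultdict

def per_group_accuracy(y_true, y_pred, groups):
    """Accuracy of emotion predictions broken down by demographic group."""
    correct, total = defaultdict(int), defaultdict(int)
    for truth, pred, group in zip(y_true, y_pred, groups):
        total[group] += 1
        correct[group] += int(truth == pred)
    return {group: correct[group] / total[group] for group in total}

if __name__ == "__main__":
    # Toy example with made-up labels, three predictions per group.
    y_true = ["happy", "sad", "fear", "happy", "sad", "fear"]
    y_pred = ["happy", "sad", "sad", "happy", "happy", "fear"]
    groups = ["group_a", "group_a", "group_a", "group_b", "group_b", "group_b"]
    print(per_group_accuracy(y_true, y_pred, groups))
    # A large accuracy gap between groups is a signal to investigate bias.
```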


Explored Technologies

To achieve accurate multi-modal emotion recognition, we are exploring the following technologies (a basic preprocessing sketch follows the list):

  • EEG (Electroencephalography): Brainwave activity monitoring.

  • ECG (Electrocardiography): Heart rate and rhythm analysis.

  • EOG (Electrooculography): Eye movement tracking for cognitive and emotional state assessment.
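
Whichever sensors we end up using, the raw signals need similar conditioning before feature extraction. The sketch below applies a zero-phase band-pass filter to one EEG channel with NumPy and SciPy; the 256 Hz sampling rate and 1–40 Hz band are assumptions that depend on the actual hardware, and ECG or EOG channels would use different cut-offs:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 256  # assumed sampling rate in Hz; depends on the actual headset

def bandpass(signal: np.ndarray, low_hz: float, high_hz: float,
             fs: int = FS) -> np.ndarray:
    """Zero-phase Butterworth band-pass filter for a 1-D physiological signal."""
    b, a = butter(N=4, Wn=[low_hz / (fs / 2), high_hz / (fs / 2)], btype="band")
    return filtfilt(b, a, signal)

if __name__ == "__main__":
    t = np.arange(0, 3, 1 / FS)  # 3 seconds of samples
    raw_eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
    clean_eeg = bandpass(raw_eeg, low_hz=1.0, high_hz=40.0)  # typical EEG band
    print(clean_eeg.shape)
```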


Reviewing State-of-the-Art Research

We have conducted an extensive review of research papers from leading journals and conferences, including:

  • Brain-to-text using EEG: Studying how brain signals can be translated into text.

  • EEG-based emotion recognition: Analyzing emotional states along different EEG signal dimensions.

  • Tone analysis and emotion tracking: Understanding vocal intonations and their psychological impact.

  • Eye-tracking in emotion detection: Assessing gaze patterns for cognitive load analysis.

  • Psychology of emotions: Researching foundational psychological models of emotion processing.

Sources include IEEE Access, the International Journal of Environmental Research and Public Health, the Association for the Advancement of Artificial Intelligence (AAAI), ICMLT, and top-ranked conferences such as NeurIPS.


Data Research

To power NeuroAI, we rely on several reputable platforms and repositories:

  • IEEE Dataport

  • Kaggle

  • GitHub

  • Hugging Face

  • PhysioNet: A trusted resource for complex physiologic signals (e.g., ECG, EEG)

  • MindBigData

  • OpenNeuro


Our data sources include publicly available datasets and proprietary research findings. Key datasets powering our models:

  • SEED-VII: EEG and eye movement data covering seven emotions (happy, sad, fear, disgust, neutral, anger, surprise).

  • MODMA: Multi-modal dataset for mental disorder analysis, available on Kaggle, GitHub, and the UK Data Service.

  • AffectNet: Facial expression-based emotion recognition available on Kaggle.

  • ZuCo: Eye-tracking and EEG data for cognitive state modeling.

  • Inner Speech EEG Dataset: Captures neural signals related to inner speech.

  • Kumar's EEG Imagined Speech Dataset: Records brain signals during imagined speech.

  • "[IMAGENET] of The Brain":

    • Contains 70,060 brain signals (3 seconds each) recorded in response to random images from the Imagenet ILSVRC2013 dataset.

    • Sourced from mindBigData and recorded from a single test subject, David Vivancos.

  • FEEL Dataset:

    • Combines force, EEG, and emotion labels recorded while participants played video games.

    • Accessible via IEEE Dataport.


These datasets allow us to build robust, generalizable models for emotion recognition (one way to bring them under a common sample format is sketched below).
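
Because these datasets differ in modalities, file formats, and label sets, one practical step is to normalize every recording window into a single container before training. The sketch below is our own illustration rather than a published format; the seven-emotion label list follows SEED-VII:

```python
from dataclasses import dataclass
from typing import Optional
import numpy as np

EMOTIONS = ["happy", "sad", "fear", "disgust", "neutral", "anger", "surprise"]

@dataclass
class MultiModalSample:
    """One aligned recording window drawn from any of the source datasets."""
    eeg: Optional[np.ndarray] = None           # channels x samples
    ecg: Optional[np.ndarray] = None           # 1-D heart signal
    eye_tracking: Optional[np.ndarray] = None  # gaze coordinates over time
    label: Optional[str] = None                # one of EMOTIONS, if annotated
    source: str = "unknown"                    # e.g., "SEED-VII", "FEEL"

    def available_modalities(self) -> list:
        return [name for name in ("eeg", "ecg", "eye_tracking")
                if getattr(self, name) is not None]

if __name__ == "__main__":
    sample = MultiModalSample(eeg=np.zeros((62, 768)), label="happy", source="SEED-VII")
    print(sample.available_modalities(), sample.label)  # ['eeg'] happy
```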


Pre-Trained Models

We leverage pre-trained AI models to accelerate our development (a brief usage sketch follows the list):

  • Voice-Based Dialogue Generation: Enhancing AI-human interactions.

  • Emotion Detection: Using DeepFace and FER+ for facial emotion recognition.

  • Tone Analysis: Employing VADER for sentiment and tone classification.
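
As a quick illustration of how these off-the-shelf models plug in, the sketch below assumes the deepface and vaderSentiment Python packages are installed and uses a placeholder image path:

```python
from deepface import DeepFace
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

def facial_emotion(image_path: str) -> str:
    """Dominant facial emotion predicted by DeepFace for a single image."""
    results = DeepFace.analyze(img_path=image_path, actions=["emotion"])
    # Recent DeepFace versions return a list with one entry per detected face.
    first = results[0] if isinstance(results, list) else results
    return first["dominant_emotion"]

def text_sentiment(utterance: str) -> dict:
    """VADER polarity scores (neg/neu/pos/compound) for a transcribed utterance."""
    return SentimentIntensityAnalyzer().polarity_scores(utterance)

if __name__ == "__main__":
    print(text_sentiment("I feel a bit anxious today, but talking helps."))
    # facial_emotion("session01_frame042.jpg")  # placeholder path; needs a real image
```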


Budget & Hardware Requirements

For real-world experimentation and model validation, we are considering:

  • CardioTek G1 300 EKG Machine: $1,295 – $2,510.

  • Pamel CombyCap EEG Cap: $143 – $500.

  • Full HD Webcam: $70 – $500.

  • EOG Hardware: $140 – $150.




Conclusion

By systematically studying existing solutions, exploring state-of-the-art research, and sourcing high-quality datasets, NeuroAI is set to push the boundaries of AI-driven mental health diagnostics. Our approach ensures ethical compliance, technical accuracy, and practical usability, laying the foundation for a transformative tool in neuropsychology.

Stay tuned as we continue integrating these guiding activities into every step of our development!

 
 
 
