A Next.js application designed to help identify autism spectrum traits through advanced eye-tracking technology. It provides a scientifically backed, non-invasive tool for assessing visual tracking patterns that may indicate autism spectrum characteristics.
The primary goal of this application is to provide an accessible, early-stage screening tool for autism spectrum traits based on eye movement patterns. Research has shown that individuals on the autism spectrum often display distinctive eye-tracking patterns when following visual stimuli. This application captures and analyzes these patterns to help identify potential indicators, which can lead to earlier intervention and support.
- Eye Tracking Assessment: Non-invasive test that tracks eye movements while following a moving target
- Real-time Analysis: Processes eye movement data to detect patterns associated with autism traits
- Visual Reporting: Provides clear visualization of tracking patterns and assessment results
- Accuracy-based Risk Assessment: Categorizes results based on tracking accuracy thresholds
- User Authentication: Secure sign-up/login with Clerk Authentication
- Responsive Design: Works on desktop, tablet, and mobile devices with webcam support
- Frontend: Next.js 15, React 19, TypeScript, Tailwind CSS
- Authentication: Clerk
- State Management: Zustand
- AI/ML Components: TensorFlow.js, BlazeFace for face detection
- Data Visualization: Canvas API for eye movement visualization
- Testing: Jest, React Testing Library
The application uses TensorFlow.js and face detection models to track eye movements in real-time through the user's webcam. This approach allows for accessible testing without specialized hardware.
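As a rough sketch of this pipeline, the per-frame detection step might look like the following, assuming the standard @tensorflow-models/blazeface package; the function names are illustrative, not the application's actual code:

```typescript
import '@tensorflow/tfjs';
import * as blazeface from '@tensorflow-models/blazeface';

// Illustrative detection step: locate the face in the current video frame
// and pull out the two eye landmarks BlazeFace reports.
async function detectEyes(video: HTMLVideoElement, model: blazeface.BlazeFaceModel) {
  const faces = await model.estimateFaces(video, false /* returnTensors */);
  if (faces.length === 0) return null;

  // BlazeFace returns six facial landmarks; the first two are the eyes.
  const landmarks = faces[0].landmarks as [number, number][];
  return { rightEye: landmarks[0], leftEye: landmarks[1] };
}

// Usage sketch: load the model once, then sample eye positions per frame.
async function startTracking(video: HTMLVideoElement) {
  const model = await blazeface.load();
  setInterval(async () => {
    const eyes = await detectEyes(video, model);
    if (eyes) console.log(eyes.rightEye, eyes.leftEye);
  }, 100);
}
```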
- Face Detection: The application uses BlazeFace (a lightweight face detection model) to locate the user's face in the webcam feed
- Eye Position Tracking: Once the face is detected, the system tracks the position of the eyes
- Pattern Analysis: As the user follows a moving ball in a square pattern, the application records eye movement data
- Data Processing: Eye movement data is analyzed for the following metrics (see the sketch after this list):
- Tracking Accuracy: How precisely the eyes follow the target
- Wiggle Score: Measurement of unwanted vertical/horizontal movements
- Square Pattern Detection: Whether the eyes follow the expected square pattern
- Saccade Frequency: Rapid eye movements between points
- Fixation Duration: How long the eyes remain fixed on specific points
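The repository's exact formulas aren't reproduced here, but one plausible sketch of the wiggle-to-accuracy conversion, assuming gaze samples paired with the target's position at each moment (the GazeSample shape and maxError constant below are assumptions), looks like this:

```typescript
// Hypothetical gaze sample: detected eye position plus the ball's position
// at the same instant.
interface GazeSample { x: number; y: number; targetX: number; targetY: number; }

// Mean distance between gaze and target, mapped onto a 0-100 accuracy score.
function trackingAccuracy(samples: GazeSample[], maxError = 100 /* px, tuning constant */): number {
  if (samples.length === 0) return 0;
  const meanError =
    samples.reduce((sum, s) => sum + Math.hypot(s.x - s.targetX, s.y - s.targetY), 0) /
    samples.length;
  return Math.max(0, 100 * (1 - meanError / maxError)); // zero error -> 100%
}
```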
The application uses research-backed thresholds to assess risk levels:
- 80% or higher tracking accuracy: Normal eye tracking (Low Risk)
- 60-80% tracking accuracy: Good eye tracking (Low Risk)
- 55-60% tracking accuracy: Moderate Risk
- Below 55% tracking accuracy: High Risk
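Expressed as code, these bands might map to a classifier like the following sketch (illustrative only, not the application's actual implementation):

```typescript
type RiskLevel = 'Low Risk' | 'Moderate Risk' | 'High Risk';

// Map a tracking-accuracy percentage onto the risk bands listed above.
function assessRisk(accuracy: number): RiskLevel {
  if (accuracy >= 60) return 'Low Risk';      // covers both the >= 80% and 60-80% bands
  if (accuracy >= 55) return 'Moderate Risk';
  return 'High Risk';
}
```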
The landing page introduces the application, explains its purpose, and offers guidance on getting started.
The main assessment page with the following phases:
- Introduction Phase: Explains the test procedure and prepares the user
- Setup Phase: Configures the webcam and ensures proper face detection
- Ready Phase: Final instructions before starting the test
- Testing Phase: The actual eye tracking test where users follow a moving ball in a square pattern
- Results Phase: Displays comprehensive results including:
- Risk assessment based on tracking accuracy
- Square pattern detection status
- Detailed metrics (saccade frequency, fixation duration, etc.)
- Personalized interpretation and suggestions
A development and validation page that shows how the eye tracking calculations work with test data of varying accuracy levels. This page demonstrates the relationship between eye movement patterns and calculated risk assessments.
A simplified version of the eye tracking test that only shows the ball animation component. Useful for testing and demonstration purposes.
A diagnostic page that demonstrates real-time eye detection using the webcam feed.
User dashboard to access past assessments and start new tests (requires authentication).
- Initial Setup: The user grants webcam permission and positions themselves in front of their camera
- Calibration: The system detects the face and eyes to ensure proper tracking
- Test Execution: A ball moves in a square pattern, and the user is instructed to follow it with their eyes
- Data Collection: The application collects approximately 30 seconds of eye movement data
- Analysis: The collected data is processed to calculate key metrics:
- Wiggle Score: Converted to a tracking accuracy percentage (higher is better)
- Square Pattern Detection: Determines if the eyes follow the expected pattern
- Saccade Frequency: Measures rapid eye movements
- Fixation Duration: Assesses attention focus capability
- Results Generation: Based on these metrics, the system generates a risk assessment and personalized feedback
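As a concrete illustration of the analysis step, saccade and fixation metrics could be derived from timestamped gaze samples roughly as follows; the sample shape and velocity threshold are assumptions, not the application's actual code:

```typescript
// Timestamped gaze sample (t in milliseconds).
interface TimedSample { x: number; y: number; t: number; }

// A saccade is counted when inter-sample gaze velocity exceeds a threshold
// (px/ms); runs below the threshold are treated as fixations.
function saccadeAndFixationMetrics(samples: TimedSample[], velocityThreshold = 0.5) {
  if (samples.length < 2) return { saccadeFrequency: 0, meanFixationMs: 0 };

  let saccades = 0;
  const fixationDurations: number[] = [];
  let fixationStart = samples[0].t;
  let inSaccade = false;

  for (let i = 1; i < samples.length; i++) {
    const dt = Math.max(samples[i].t - samples[i - 1].t, 1);
    const v = Math.hypot(samples[i].x - samples[i - 1].x, samples[i].y - samples[i - 1].y) / dt;
    if (v > velocityThreshold && !inSaccade) {
      saccades++;
      fixationDurations.push(samples[i - 1].t - fixationStart); // close the current fixation
      inSaccade = true;
    } else if (v <= velocityThreshold && inSaccade) {
      fixationStart = samples[i].t; // a new fixation begins
      inSaccade = false;
    }
  }
  if (!inSaccade) fixationDurations.push(samples[samples.length - 1].t - fixationStart);

  const durationSec = (samples[samples.length - 1].t - samples[0].t) / 1000;
  return {
    saccadeFrequency: saccades / durationSec, // saccades per second
    meanFixationMs:
      fixationDurations.reduce((a, b) => a + b, 0) / Math.max(fixationDurations.length, 1),
  };
}
```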
- WebcamFeed: Captures video input and processes it for eye tracking
- AnimatedBall: Renders the moving target that follows a square pattern
- EyePathCanvas: Visualizes the eye movement paths compared to the ideal square pattern
- EyeTrackingVisualizer: Provides heat maps and trail visualizations of eye movements
- ResultsPhase: Displays comprehensive analysis and recommendations
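As an illustration of how AnimatedBall's square trajectory could be parametrized, the hypothetical helper below maps elapsed time to a position along the square's perimeter (not the component's actual code):

```typescript
// Position of the target along a square path, relative to the square's
// top-left corner, as a function of elapsed time.
function squarePosition(t: number, period: number, side: number): { x: number; y: number } {
  const p = ((t % period) / period) * 4; // progress in [0, 4), one unit per edge
  const d = (p % 1) * side;              // distance along the current edge
  switch (Math.floor(p)) {
    case 0: return { x: d, y: 0 };           // top edge, left -> right
    case 1: return { x: side, y: d };        // right edge, top -> bottom
    case 2: return { x: side - d, y: side }; // bottom edge, right -> left
    default: return { x: 0, y: side - d };   // left edge, bottom -> top
  }
}
```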
- Clone the repository
- Install dependencies:
npm install
- Set up environment variables:
Create a .env.local file in the root directory with the following variables:
# Clerk Authentication (required for auth features)
NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY=your_clerk_publishable_key
CLERK_SECRET_KEY=your_clerk_secret_key
NEXT_PUBLIC_CLERK_SIGN_IN_URL=/sign-in
NEXT_PUBLIC_CLERK_SIGN_UP_URL=/sign-up
NEXT_PUBLIC_CLERK_AFTER_SIGN_IN_URL=/dashboard
NEXT_PUBLIC_CLERK_AFTER_SIGN_UP_URL=/dashboard
# Next.js
NEXT_PUBLIC_APP_URL=http://localhost:3000
Note: Authentication is only required for dashboard features. The eye tracking test can be used without authentication.
- Run the development server:
npm run dev
- Open http://localhost:3000 with your browser
- Navigate to the eye tracking test page to start an assessment
- Modern web browser (Chrome, Firefox, Safari, Edge)
- Webcam access
- Good lighting conditions for optimal face detection
- Stable internet connection for initial loading of ML models
- All eye tracking processing happens locally in the browser
- No video data is sent to external servers
- Assessment results can be saved to user accounts if authenticated
- npm run dev: Run the development server
- npm run build: Build the application for production
- npm start: Start the production server
- npm run lint: Run ESLint to check for code quality issues
- npm test: Run tests
- TensorFlow.js Documentation
- Next.js Documentation
- Tailwind CSS Documentation
- Research on Eye Tracking and Autism
This project is licensed under the MIT License - see the LICENSE file for details.