EEG Quick Start¶
This guide walks you through loading, preprocessing, and scoring your first EEG recording with AIME LOC.
Installation¶
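The install command for the SDK is assumed here to follow the usual PyPI convention for its import path (`aime_loc`); the exact package name and extras are assumptions, so check the project's package index page:

```shell
# Hypothetical package name inferred from the `aime_loc` import path
pip install aime-loc
```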
This installs:
- MNE-Python — multi-format EEG loading and preprocessing
- SciPy — Welch PSD computation
- matplotlib — visualization
The EEG Pipeline¶
- Load: Read any EEG format into an EEGRecording
- Preprocess: Bandpass filter, notch filter, re-reference
- Extract: Compute Power Spectral Density (PSD) epochs
- Score: Send PSD to server, receive TC scores
- Analyze: Visualize, export, compare
Step-by-Step¶
1. Initialize¶
from aime_loc import LOC
from aime_loc.eeg import EEG
loc = LOC(api_key="sk-aime-...")
eeg = EEG(loc)
2. Load EEG Data¶
# EEGLAB format
recording = eeg.load("subject01.set")
# EDF format
recording = eeg.load("subject01.edf")
# BrainVision format
recording = eeg.load("subject01.vhdr")
# NumPy array (requires sfreq)
import numpy as np
data = np.random.randn(32, 128000) # 32 channels, 500 seconds @ 256Hz
recording = eeg.load(data, sfreq=256)
# Consumer device with preset
recording = eeg.load("muse_session.csv", device="muse", sfreq=256)
3. Preprocess¶
# Default preprocessing (recommended)
recording.preprocess()
# Custom preprocessing
recording.preprocess(
bandpass=(1.0, 40.0), # 1-40 Hz bandpass
notch=60.0, # 60 Hz notch (US power line)
reference="average", # Average re-reference
pick_eeg=True, # Remove non-EEG channels
)
print(recording)
# EEGRecording(eeglab, 32ch, 500Hz, 300.0s, preprocessed)
# Audit trail
for step in recording.preprocessing_log:
print(f" - {step}")
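The bandpass and notch settings above correspond to standard digital filters. As a standalone sketch of what they do (using SciPy directly — the SDK's actual implementation may differ, e.g. by using MNE's FIR filters), here is a 1–40 Hz bandpass plus 60 Hz notch applied to a synthetic signal with power-line noise:

```python
import numpy as np
from scipy import signal

sfreq = 500.0
t = np.arange(0, 10, 1 / sfreq)  # 10 s of data at 500 Hz
# 10 Hz "alpha" component plus 60 Hz power-line noise
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 60 * t)

# 1-40 Hz Butterworth bandpass, applied zero-phase
sos = signal.butter(4, [1.0, 40.0], btype="bandpass", fs=sfreq, output="sos")
x_bp = signal.sosfiltfilt(sos, x)

# 60 Hz IIR notch, quality factor 30
b, a = signal.iirnotch(60.0, Q=30.0, fs=sfreq)
x_clean = signal.filtfilt(b, a, x_bp)

def band_power(y, f0):
    """Welch PSD value at the bin nearest f0."""
    f, p = signal.welch(y, fs=sfreq, nperseg=1024)
    return p[np.argmin(np.abs(f - f0))]

print(band_power(x, 60) / band_power(x_clean, 60))  # line noise: heavily attenuated
print(band_power(x, 10) / band_power(x_clean, 10))  # alpha band: roughly preserved
```

The 60 Hz component is attenuated by orders of magnitude while in-band activity passes through nearly untouched, which is the behavior the SDK's defaults aim for.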
4. Extract PSD Epochs¶
epochs = recording.extract_epochs(duration=2.0)
print(epochs)
# EpochSet(n_epochs=150, freq_range=0.5-45.0 Hz)
print(f"Shape: {epochs.shape}") # (150, ~90) — 150 epochs, ~90 freq bins
print(f"Duration: {epochs.duration}s") # 300.0s total
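The numbers above follow from simple arithmetic: a 300 s recording cut into 2 s windows gives 150 epochs, and a 2 s window yields 0.5 Hz frequency resolution, so the 0.5–45 Hz range spans 90 bins. A standalone sketch of this style of PSD epoching with SciPy, matching the example recording (32 channels at 500 Hz for 300 s) — the SDK's internals may differ:

```python
import numpy as np
from scipy.signal import welch

sfreq, n_ch, dur = 500, 32, 300.0
data = np.random.randn(n_ch, int(sfreq * dur))  # stand-in for real EEG

epoch_len = int(2.0 * sfreq)           # 2 s epochs -> 1000 samples
n_epochs = data.shape[1] // epoch_len  # 150 non-overlapping epochs

psds = []
for i in range(n_epochs):
    seg = data[:, i * epoch_len:(i + 1) * epoch_len]
    f, p = welch(seg, fs=sfreq, nperseg=epoch_len)  # 0.5 Hz resolution
    band = (f >= 0.5) & (f <= 45.0)                 # keep 0.5-45 Hz
    psds.append(p[:, band].mean(axis=0))            # average over channels
psds = np.asarray(psds)

print(psds.shape)  # (150, 90)
```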
5. Score via API¶
profile = eeg.score(epochs, subject="sub-01", task="nback")
print(profile)
# EEGCognitiveProfile(sub-01, TC=23.40%)
print(profile.summary())
# EEG(sub-01, task=nback): TC=23.40% (150 epochs, 32ch @ 500Hz)
6. Explore the Results¶
# Overall TC score
print(f"TC Score: {profile.tc_score:.2f}%")
# Per-function scores
for func, score in profile.tc_by_function().items():
print(f" {func}: {score:.2f}%")
# Best and worst functions
print(f"Best: {profile.best_function}")
print(f"Worst: {profile.worst_function}")
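Since tc_by_function() returns a plain mapping of function name to score, ranking the functions yourself is ordinary dict work. The names below are placeholders, not the SDK's actual 13 function labels:

```python
# Placeholder scores standing in for profile.tc_by_function()
scores = {"attention": 31.2, "working_memory": 18.7, "language": 25.4}

# Rank functions from highest to lowest score
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
for func, score in ranked:
    print(f"  {func}: {score:.2f}%")

best_function, worst_function = ranked[0][0], ranked[-1][0]
print(best_function, worst_function)  # attention working_memory
```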
7. Visualize¶
# 13-function cognitive radar chart
profile.radar_chart()
# PSD plot
from aime_loc.eeg.viz import psd_plot
psd_plot(epochs)
# Time series with TC annotation
from aime_loc.eeg.viz import timeseries_plot
timeseries_plot(epochs, profile)
# Save publication-ready figures
profile.radar_chart(show=False, save="eeg_radar.png", journal="nature", dpi=300)
8. Export¶
# JSON
profile.to_json("eeg_profile.json")
# CSV
profile.to_csv("eeg_scores.csv")
# LaTeX table (for papers)
print(profile.to_latex())
What Happens Server-Side?¶
When you call eeg.score(epochs), the SDK sends the raw PSD array to the AIME API. The server applies proprietary algorithms to measure how coherently all 13 cognitive functions operate together, and returns the TC score and per-function breakdown.
The PSD data sent is a frequency-domain aggregate — it contains no temporal patterns, no raw waveforms, and cannot be reversed to identify individuals.
Data Privacy
PSD arrays are ~650KB for a 30-minute recording. They are non-identifiable frequency-domain summaries, not raw EEG data.
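One way to see the "no temporal patterns" claim concretely (using SciPy directly, independent of the SDK): a signal and its time-reversed copy have very different waveforms, but with non-overlapping Welch segments and a symmetric window they produce identical PSDs, because the PSD discards all temporal ordering:

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(0)
sfreq = 256
x = rng.standard_normal(sfreq * 60)  # 1 minute of stand-in "EEG"

# Non-overlapping 2 s segments with a symmetric Hann window
kw = dict(fs=sfreq, window=np.hanning(2 * sfreq), noverlap=0)
_, psd_fwd = welch(x, **kw)
_, psd_rev = welch(x[::-1], **kw)  # time order destroyed

print(np.allclose(psd_fwd, psd_rev))  # True: the PSD keeps no temporal ordering
```

The ~650 KB figure is also consistent with the epoching above: a 30-minute recording gives 900 two-second epochs × 90 frequency bins × 8 bytes ≈ 648 KB.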
MNE Power Users¶
If you need custom preprocessing (ICA, source localization, etc.), you can escape to MNE and come back:
recording = eeg.load("subject01.set")
# Get the MNE Raw object
raw = recording.to_mne()
# ... custom MNE preprocessing ...
import mne
raw.filter(l_freq=1.0, h_freq=None)  # ICA fits best on >=1 Hz high-passed data
ica = mne.preprocessing.ICA(n_components=20)
ica.fit(raw)
raw = ica.apply(raw)
# Return to AIME pipeline
recording = eeg.from_mne(raw)
epochs = recording.extract_epochs()
profile = eeg.score(epochs)
Next Steps¶
- Loading EEG Data — All supported formats and devices
- Preprocessing — Detailed preprocessing options
- Multi-Subject Studies — Batch processing for research
- Cross-Substrate Comparison — Human vs AI profiles