welcome to introspection ft. harsehaj! i'm harsehaj, and i'm always up to something in social good x tech.
this publication is a place for me to reflect on productivity, health and tech, and drop unique opportunities in the space right to your inbox daily. if you're new here, sign up to tune in!
scroll to the end for my daily roundup on unique opportunities!
onto today's topic: brain computer interfaces 🧠
ever see those caps with wires sticking out and hooked up to a computer? if you haven't, this is awkward. if you have, that's an application of brain computer interfaces (bcis), one of my favourite fields in tech.
iâll give a quick overview. :)
a bci is a system that interprets brain signals and translates them into commands that can control external devices like computers, prosthetic limbs, or even smart home systems. this process can be broken down into 4 steps:
1) capturing brain activity
the brain generates electric signals as a result of neuron activity, and bci systems can capture this non-invasively through eeg (electroencephalography) data collection, or invasively through electrocorticography or implanted micro-electrodes. each method trades signal quality against invasiveness: eeg is safe and easy to wear but noisy, while implanted electrodes pick up cleaner signals from specific brain regions.
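to make step 1 concrete, here's a tiny sketch of what raw eeg-style data looks like as numbers. this is simulated, not a real recording: a 10 hz alpha-band rhythm buried in random noise, sampled at 250 hz (a common eeg sampling rate). the numbers here are made up for illustration.

```python
import numpy as np

fs = 250                      # sampling rate in hz (typical for consumer eeg)
n = 2 * fs                    # 2 seconds of samples
t = np.arange(n) / fs         # time axis in seconds

# simulated "brain signal": a 10 hz alpha rhythm plus random noise
alpha = np.sin(2 * np.pi * 10 * t)
noise = 0.8 * np.random.randn(n)
raw_eeg = alpha + noise       # one channel, shape (500,)
```

a real capture would stream arrays like this continuously, one per electrode.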
2) signal processing
that data collected in step 1 typically isn't usable, so naturally, the next step is to clean it up, which is a process involving preprocessing and feature extraction. first, the "noise" generated by muscle movement, eye blinks or any external interference is filtered out so that meaningful signal patterns linked to different features can be extracted. these signal patterns, or features, are then translated to intent. for example, if an eeg detects increased activity in the motor cortex, the system could infer that the user is trying to move their hand.
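as a toy version of that clean-up step, here's a band-pass filter in python with scipy that keeps only the 8-12 hz alpha band. real bci preprocessing does much more (artifact rejection, spatial filtering), but the core idea is the same: throw away frequencies you don't care about.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250  # sampling rate in hz (assumed, matching a typical eeg setup)

# design a butterworth band-pass filter for the 8-12 hz alpha band
b, a = butter(4, [8, 12], btype="bandpass", fs=fs)

# noisy one-channel signal: a 10 hz rhythm plus broadband noise
t = np.arange(2 * fs) / fs
raw = np.sin(2 * np.pi * 10 * t) + 0.8 * np.random.randn(t.size)

clean = filtfilt(b, a, raw)  # zero-phase filtering, so no time shift
```

after filtering, most of the broadband noise power is gone and the alpha rhythm dominates.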
3) machine learning for translation
we typically don't leave the overwhelming work of parsing through thousands of different scans/frames to humans. instead, bcis use machine learning to translate the extracted patterns into commands.
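here's a hedged sketch of that translation step using scikit-learn: a linear classifier trained on made-up band-power features to tell "left hand" intent from "right hand" intent. real systems use richer features and far more data; everything here is synthetic for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# fake band-power features, shape (trials, 2 channels):
# "left" trials have more power on channel 1, "right" trials on channel 0
left = rng.normal([1.0, 3.0], 0.5, size=(100, 2))
right = rng.normal([3.0, 1.0], 0.5, size=(100, 2))
X = np.vstack([left, right])
y = np.array([0] * 100 + [1] * 100)  # 0 = left hand, 1 = right hand

clf = LogisticRegression().fit(X, y)
print(clf.predict([[0.9, 3.1]]))  # a clearly "left"-looking trial
```

logistic regression is about the simplest choice here; classic eeg pipelines often use linear models exactly because the features are low-dimensional and data is scarce.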
4) executing commands
once a command is decided, it's sent to an external device such as a robotic prosthetic limb or a wheelchair. for example, an eeg headset could detect brain activity related to focus and let a user type without touching a keyboard. ⌨️
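tying the steps together, the final dispatch can be as simple as a lookup from decoded intent to a device action. this is a hypothetical sketch: the device functions and intent labels below are made up, and a real system would talk to actual hardware.

```python
# hypothetical mapping from decoded intent to a device action
def move_wheelchair(direction: str) -> str:
    # stand-in for real hardware i/o (motor controller, serial port, etc.)
    return f"wheelchair moving {direction}"

COMMANDS = {
    "left_hand": lambda: move_wheelchair("left"),
    "right_hand": lambda: move_wheelchair("right"),
    "rest": lambda: "wheelchair stopped",
}

decoded_intent = "left_hand"        # output of the classifier in step 3
print(COMMANDS[decoded_intent]())   # prints "wheelchair moving left"
```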
bcis are an exciting intersection of neuroscience, computer science, and engineering. personally, it also feels like our closest reach to science fiction. not to mention, there are so many opportunities to do good in the health sector with this technology.
Receive 2 free months of running training by participating in the 2025 RunDot Project.
The RunDot Project is an annual research initiative that helps runners reach their true potential through optimized training methods.
Apply here to find out if you qualify (it only takes 3 minutes).
daily opportunity + resource drops
check out leapyear's program to heads-down build for a year and get $30,000 in funding
register to compete in hoohacks, uva's hackathon on march 29-30
if you're interested in my project/life updates, check out my other bi-monthly newsletter: harsehaj's headquarters. or, just reply to this email saying you'd like to opt in to emails from my headquarters.
interested in promoting an opportunity on my blog? check out my promo package & shoot me a line at harsehajx@gmail.com.
teehee,
harsehaj ✌️
PS. if you have a question/topic you think would be interesting for me to reflect on, don't hesitate to reply to my emails with any ideas you ever have. :)