What if we could use machine learning on brain signals? We can control the stimuli presented to a person, and we have some ability to record the brain’s activity through fMRI scans. What if we trained a computer to recognize what a person is seeing or feeling just from brain scans, using the controlled stimuli as the ground-truth labels? I haven’t seen any research like this, and it seems (relatively) simple, given the advances in computing power and AI.
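To make the idea concrete, here is a toy sketch of that training loop. Everything here is invented for illustration: real fMRI data would be enormous voxel grids, not three-element vectors, and a real decoder would be far more sophisticated than the nearest-centroid classifier used below. The point is only the structure: present a known stimulus, record a "scan", and use the stimulus as the label.

```python
import random

random.seed(0)

STIMULI = ["face", "house"]

def synthetic_scan(stimulus, noise=0.5):
    """Fake a scan: each stimulus gets a characteristic activation pattern."""
    base = [1.0, 0.0, 1.0] if stimulus == "face" else [0.0, 1.0, 0.0]
    return [v + random.gauss(0, noise) for v in base]

# "Force" each stimulus repeatedly and record (scan, true label) pairs.
training = [(synthetic_scan(s), s) for s in STIMULI for _ in range(50)]

# Nearest-centroid classifier: average the scans recorded per stimulus.
centroids = {}
for s in STIMULI:
    scans = [scan for scan, label in training if label == s]
    centroids[s] = [sum(col) / len(scans) for col in zip(*scans)]

def decode(scan):
    """Guess which stimulus produced this scan."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(scan, c))
    return min(STIMULI, key=lambda s: dist(centroids[s]))

# Decode a fresh scan the model has never seen before.
guess = decode(synthetic_scan("face"))
```

The same skeleton would apply with real scans: the hard parts are the recording fidelity and the model, not the training setup.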
What if we could have a universal cup-holder? Or, at least, a cup-holder “adapter” that fits into standard car cup-holders and can hold a cup of any size? I have an idea with an “expanding spring” on the outside and a “contracting spring” on the inside, where “expanding” and “contracting” refer to the radius of the coils. The two would have concentric plastic plates on the bottom, connected at the center.