He challenged research subjects to connect all nine dots using just four straight lines without lifting their pencils from the page. Today many people are familiar with this puzzle and its solution. The correct solution, however, requires you to draw lines that extend beyond the area defined by the dots. At the first stages, all the participants in Guilford's original study censored their own thinking by limiting the possible solutions to those within the imaginary square (even those who eventually solved the puzzle). Even though they weren't instructed to restrain themselves from considering such a solution, they were unable to "see" the white space beyond the square's boundaries. Only 20 percent managed to break out of the illusory confinement and continue their lines in the white space surrounding the dots.

For a while now there's been a very weird phenomenon on YouTube, whereby popular children's videos are pirated, remixed, and re-uploaded as advertising delivery vehicles. The content and keywords on these ad-videos are largely algorithmically composed, and optimized for maximum eyeball draw. (The preceding link is long and deeply creepy in its implications: it's a must-read.) And when algorithms go hog-wild to maximize eyeballs and/or sales, you get weird and unpleasant results like this. (This came up because some idiot wrote a bot to sell tee shirts via Amazon, with the caption "Keep Calm and [X] [Y]", where [X] and [Y] are phrases generated by some sort of machine learning system scraping lists of verbs and pronouns.)

For those of you who don't read the links: you can train off-the-shelf neural networks to recognize faces (or other bits of people and objects) in video clips. We can now automate the video-photoshopping of subjects so that, for example, folks like me don't look as unattractive in a talking-heads TV interview. (Second generation application: Hitler sums it up, now with fewer subtitles.) There are innocuous uses, of course.

Everything becomes deniable, and in an age of state-sponsored infowar waged in social media it'll be trivially easy to discredit … The political consequences of this toxic metastasis of "false news" I leave for discussion in comments.
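Mechanically, the "Keep Calm and [X] [Y]" tee-shirt bot described earlier is nothing more exotic than combinatorial text generation: cross a list of verbs with a list of objects and ship every result. Here is a minimal sketch of that idea; the word lists and function name are invented for illustration, not taken from the actual bot:

```python
import itertools

# Invented sample word lists. The real bot reportedly scraped verb and
# pronoun lists, with no human reviewing the combinations it produced.
VERBS = ["Carry", "Choose", "Grab", "Hug"]
OBJECTS = ["On", "It", "Them", "This"]

def keep_calm_slogans(verbs, objects):
    """Generate every 'Keep Calm and <verb> <object>' combination."""
    return [f"Keep Calm and {v} {o}"
            for v, o in itertools.product(verbs, objects)]

slogans = keep_calm_slogans(VERBS, OBJECTS)
print(len(slogans))   # 4 verbs x 4 objects -> 16 slogans
print(slogans[0])     # -> Keep Calm and Carry On
```

Note there is no filtering step: every combination gets listed for sale, which is exactly how a bot like this ends up printing slogans no human would ever have approved.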