So in 2016, I attended rallies of then-candidate Donald Trump to study, as a scholar, the movement supporting him. I study social movements, so I was studying it, too. And then I wanted to write something about one of his rallies, so I watched it a few times on YouTube. YouTube started recommending to me and autoplaying white supremacist videos in increasing order of extremism. If I watched one, it served up one even more extreme and autoplayed that one, too.

Do you ever go on YouTube meaning to watch one video, and an hour later you've watched 27? You know how YouTube has this column on the right that says "Up next," and it autoplays something? It picks up on what you have watched and what people like you have watched, and infers that that must be what you're interested in, what you want more of, and just shows you more.

It's giant matrices, thousands of rows and columns, maybe millions of rows and columns, and neither the programmers nor anybody who looks at it, even with all the data, understands anymore exactly how it's operating, any more than you'd know what I was thinking right now if you were shown a cross section of my brain. It's like we're not programming anymore; we're growing intelligence that we don't truly understand.

Now, the problem isn't solved if he doesn't publish it, because there are already companies developing this kind of technology, and a lot of the stuff is just off the shelf.

What we need to fear most is not what artificial intelligence will do to us on its own, but how the people in power will use artificial intelligence to control us and to manipulate us in novel, sometimes hidden, subtle and unexpected ways. Much of the technology that threatens our freedom and our dignity in the near-term future is being developed by companies in the business of capturing and selling our data and our attention to advertisers and others: Facebook, Google, Amazon, Alibaba, Tencent.
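The "what people like you have watched" logic described above is, at its core, collaborative filtering. Here is a minimal sketch of that idea in Python; the usernames, video names, and watch histories are purely hypothetical illustrations, and real systems operate on learned embeddings over matrices with millions of rows, not hand-written sets.

```python
# Toy item recommendation via user-based collaborative filtering.
# All users, videos, and data below are invented for illustration.
from math import sqrt

# Rows = users, entries = videos they have watched.
history = {
    "alice": {"rally_clip", "news_analysis", "extreme_video"},
    "bob":   {"rally_clip", "news_analysis"},
    "carol": {"rally_clip", "cat_video"},
}

def similarity(a: set, b: set) -> float:
    """Cosine similarity between two users' binary watch vectors."""
    if not a or not b:
        return 0.0
    return len(a & b) / (sqrt(len(a)) * sqrt(len(b)))

def up_next(user: str, k: int = 1) -> list:
    """Score videos the user hasn't seen by how similar their watchers are."""
    seen = history[user]
    scores: dict = {}
    for other, other_seen in history.items():
        if other == user:
            continue
        sim = similarity(seen, other_seen)
        for video in other_seen - seen:
            scores[video] = scores.get(video, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(up_next("bob"))  # bob's history overlaps most with alice's,
                       # so her unseen video is recommended: ['extreme_video']
```

Note how the toy example already exhibits the dynamic from the talk: the user whose history most resembles yours pulls the recommendation, with no notion of whether the recommended item is good for you.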