Category Archives: Human Behavior

Aaron Sorkin Dramatizes ‘The Trial of the Chicago 7’ – Shepherd Express

One of the fears going into 2020 was that the Democratic National Convention would become a replay of 1968, only this time with right-wing agitation stoking the violence instead of left-wing protests. As it happened, this year's DNC went virtual, but 50-plus years later the argument over 1968 continues. Did the mayhem that erupted in Chicago around the DNC frighten middle-class voters, ushering Nixon into the White House on a law-and-order platform?

And in the words of William Kunstler to Tom Hayden in The Trial of the Chicago 7, "Who started the riot, Tom?" Writer-director Aaron Sorkin (Moneyball, Steve Jobs) poses the situation from many angles, apportioning responsibility across many hands and weighing the different varieties of idealism. Sorkin doesn't tell us what to think as much as give us things to think about.

The cast is tops, playing recognizably believable versions of people who were notorious in their day. The prominent defendants in the federal case brought against several DNC protest leaders by the Nixon administration in 1969 were a mixed bag: Abbie Hoffman (hilariously channeled by Sacha Baron Cohen), Jerry Rubin (Jeremy Strong), Tom Hayden (Eddie Redmayne), Rennie Davis (Alex Sharp), David Dellinger (John Carroll Lynch) and Bobby Seale (Yahya Abdul-Mateen II). Dellinger is the odd man out, a middle-class dad and scoutmaster who happens to be a pacifist. The others are young, and Sorkin neatly positions them across the political spectrum: Davis is an idealist whose only commitment is to end the killing in Vietnam. Hayden wants to change the system by working within the system. Hoffman, a prankster who thinks the system is absurd, calls for a cultural as well as a political revolution. His friend Rubin is even more fervid, teaching bomb-making classes in preparation for storming the citadel.

Seale, a Black Panther leader, stands in a class by himself. "Dr. King had a dream. Now he has a bullet in his head," he says. He expects no mercy from the system. He's not playing games.

Except for Seale, the defendants are represented by star attorney William Kunstler, played with world-weary, street (and court)-wise pragmatism by Milwaukee-reared Mark Rylance. In Sorkin's screenplay, Kunstler is the anchor that keeps his unruly clients more or less in line. Sitting across from them on the dais is Federal Judge Julius Hoffman (Frank Langella), speaking in orotund patrician vowels and brooking no dissent.

Abbie Hoffman and Rubin are determined to monkey-wrench the proceedings. They wear judicial robes to court and, when ordered to remove them, strip to reveal police uniforms. Contempt of court citations stream from the bench. Seale's attempts to plead his case are struck down with force. Eventually he is carried into the courtroom bound and gagged and chained to a chair. Sorkin suggests that Judge Hoffman was senescent, unaware that he, as much as Abbie Hoffman and Rubin, is turning the trial into a spectacle.

Sorkin repeatedly flashes back from the courtroom to events on the ground near the DNC. Told to give no quarter, Chicago Mayor Richard Daley's police are ready to break heads. Most of the protesters think they are making a statement on the war for convention delegates and the hovering network news cameras, but a combination of irresponsible parties, unanticipated circumstances and the heat of events sparked the clashes that remain synonymous with Chicago 1968.

With a masterful understanding of human behavior, Sorkin manages to endow all the leading figures with sympathy, except perhaps for one: Nixon's blunt-speaking Attorney General John Mitchell, who orders the prosecution for reasons that have less to do with law enforcement than with politics and his own petulance.

David Luhrssen lectured at UWM and the MIAD. He is author of The Vietnam War on Film, Encyclopedia of Classic Rock, and Hammer of the Gods: Thule Society and the Birth of Nazism.

Nov. 16, 2020

9:30 a.m.

View post:
Aaron Sorkin Dramatizes 'The Trial of the Chicago 7' - Shepherd Express

Learning action plans: The no cost way to increase learning transfer – Training Journal

As a learning professional, the end goal of your learning interventions is not just to improve the learning of participants, but to elicit application of learning back on the job. The process of taking what is learned in a training intervention and applying that learning to an on-the-job setting is known as learning transfer.

It is only through this process that L&D interventions can be said to have made a measurable difference to an employee's performance. Yet it is surprisingly rare. Authors disagree on the exact success rate, but estimates suggest that only 10-22% of training leads to learning transfer.

This article will outline an easy-to-implement method that can increase the amount of learning transferred to the role with no set-up costs. To understand why there is so little transfer, let's take a look at what learners usually do after training.

The noise

When learners return to their work following a typical training event, the first thing they do is check their email, catch up on social media, or chat to their team about what they missed during the training. One thing's for sure: they aren't thinking about what they can do to turn their learning into tangible improvements on the job.

This influx of information (email, social media, side conversations) scrambles the message the training tried so hard to deliver. The training message now has to compete with office gossip, critical updates from teammates and managers, as well as the addictive, infinite stream of email and social media, for the slim chance of being transferred to the learner's long-term memory.

Good luck.

As a learning professional, however, you can do something to boost the signal of your training message and increase post-training follow-through: learning action plans.

Learning action plans, learner-generated plans for implementing learning after a training event, can keep the training message clear and cut through the noise of the post-training onslaught of sensory information, increasing the chances of learning implementation and training transfer.

Research on effectiveness of learning action plans

Learning action plans have been shown to increase attention in training, improve performance scores on trained behaviour post-training, and result in higher goal achievement across a number of domains.

Learning action plans work by leveraging accountability, social commitment, and goal setting.

Key factors to consider when creating your own learning action plan

Clarity

Learners need to be clear about exactly what they want to achieve. A learning action plan should prompt the learner to reflect on their learning and define the elements they will take ownership of and see through into action. A poorly defined goal is hard to achieve; without definition, it's hard to know whether progress towards the goal has been made.

Desire

Understanding why we want something helps us to formulate in our mind how we get it. Why are you reading this article? Chances are you want to improve the effectiveness of your training, or you want your learners to learn more, or you are dissatisfied with the effectiveness of training in general.

You have a burning desire to improve your training and are looking for practical implementations. With your desire framed clearly in your mind, you are more likely to take steps to follow through and implement learning action plans. Good! Read on.

Support

Human beings are social animals; we crave interaction with others. Organisations are living organisms, constantly changing and teeming with life and energy generated by the countless interactions of their people throughout the day.

People help others and require help from others. It's the same with learning. What help do you need to action your plan? Do you need additional information, and if so, from whom? Maybe you need to check your progress regularly with a colleague or manager? Do you need your manager to help expose you to more situations where you will use your new skills?

Maybe you need time to practice a new skill, and the support of your manager to make that time?

Action

Now that you've defined the what, why, and who, it's time to define how you are going to do it. What specific actions are you going to take, and in what situations will you take them? Visualize yourself in that situation following through on your plan.

For example, following time management training: "I am going to block 15 minutes at the start of my day to plan the rest of my day in 30-minute chunks. If I do not complete my task in the allotted 30 minutes, I move to the next task and 30-minute block regardless."

Towards the end of your training, ask your learners to take a few moments to think about what they want to change about their behavior based on what they learned in training. Ask them to commit this to paper, along with a timeline for when they will do it and any support they will need from managers or colleagues to achieve their goal.

Use the template below to rapidly kick-start learning transfer after your next training event:

Learning action plan template
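
The template graphic itself does not come through in this text version of the article. Purely as an illustration, and assuming nothing beyond the four factors described above (clarity, desire, support, action), a minimal plan might capture prompts like the following; the field names and wording are hypothetical, not the article's official template.

```python
# Illustrative sketch only: hypothetical fields based on the four factors
# discussed above (clarity, desire, support, action).
learning_action_plan = {
    "clarity": "What, exactly, do I want to achieve as a result of this training?",
    "desire":  "Why does this matter to me and my role?",
    "support": "Whose help do I need (manager, colleagues), and what kind?",
    "action":  "What will I do, in which situations, and by what date?",
}

# Print the plan as a simple fill-in prompt sheet
for factor, prompt in learning_action_plan.items():
    print(f"{factor.title()}: {prompt}")
```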

Key takeaways

The learning action plan is possibly the most psychologically important tool in your learning transfer toolkit: it leverages decades of human behaviour research in the areas of accountability, social commitment, and goal setting.

Learning action plans are easy to set up: add the template above to the final slide of your training deck, or use it to create a survey sent electronically to learners on completion of self-directed learning modules.

Want to boost the chances of learning transfer happening? Encourage your learners to send their action plan to their manager and prompt a discussion about their learning and the support required to turn learning into performance.

About the author

Fergal Connolly is a learning transfer expert and holds an MSc in Education and Training, and a BSc in Psychology.

Continued here:
Learning action plans: The no cost way to increase learning transfer - Training Journal

Male Science Fiction Movies are About Men Having a Romance with Their AI Women: Shalini Kantayya on Coded Bias – Filmmaker Magazine

A.I. and machine learning models can decide who is accepted into college, who gets housing, who gets approved for loans, who gets a job, and what advertisements appear on our social media and when. The extent of what A.I. dictates in our lives, and how, is unfathomable to us because it is essentially unregulated, yet we have accepted these invisible systems into our lives with incredible faith and speed. We trust the algorithms, assuming their mathematical functions lack the ability or will to hurt us. But activist and filmmaker Shalini Kantayya's film Coded Bias shows us how these systems will always be, for better and for worse, reflections of the people who made them. Algorithms and A.I., Kantayya reveals to us, are prone to recreating and even automating our worst human biases.

With no government regulation, algorithms and A.I. that discriminate against women and people of color can freely enter the world. When these biases in the system are discovered after they've already been implemented, programmers and the companies who employed them are not at fault. It is written off as a technical issue the programmer did not intend; there are no legal repercussions. But as is typically the case in many fields, Black women are at the head of the charge for improvements in the tech industry. Computer scientist and poet Joy Buolamwini (who calls herself a "poet of code") started the Algorithmic Justice League when she discovered her facial recognition software wasn't recognizing her face because it was biased towards white complexions. She ends the film testifying before Congress and scaring Democrats and Republicans alike with the dangers of unregulated tech.

Another reason these algorithms are so untouchable is that the language surrounding them is abstruse and their functions are hardly ever transparent to consumers. Kantayya tells us how she streamlined this technobabble with Coded Bias so that we can understand the havoc A.I. wreaks every day in the background of our lives. In doing so, she makes the algorithms feel much more manageable, and maybe a little less horrifying because of that.

Filmmaker: As the election came down to the difference of just thousands or hundreds of votes, it is hard not to think about what Coded Bias shows about social media's potential to sway elections one way or the other.

Shalini Kantayya: Zeynep Tufekci cites this Facebook study that was published in Nature magazine in 2010, which shows the difference between the small, incremental change of showing your friends' faces with an "I Voted" sign that Facebook implemented [versus] not showing their faces. They found out that Facebook could sway in excess of 300,000 people to the polls. Basically, what that goes to show is how these imperceptible changes in the way the algorithms work can have very real outcomes on human behavior.

Filmmaker: The doc also shows how unregulated tech is a rare mutual fear between Democrats and Republicans.

Kantayya: AOC, left liberal from Queens, is agreeing with Jim Jordan, conservative Republican from Ohio. There's this scene where Jim Jordan says, "Wait a minute. 117 million people are in a police database that [police] can access without a warrant, and there's no one in elected office overseeing this process?" That was a rare moment where I hoped both sides of the aisle could see the issue.

Filmmaker: How do you start into this? Is your shooting schedule the first skeleton of the project?

Kantayya: No. I couldn't talk to people at parties about what I was working on because it was so hard to describe. I think I started with a few core interviews, maybe four, and from that process fell down the rabbit hole, went deeper and deeper into the story and built the arc from that. I think it wasn't until Joy went to D.C. to testify before Congress that I had a documentary. I had a beginning, middle, and end, and the character had gone on a journey. [laughs]

Filmmaker: Who or what decided when your shoot ended?

Kantayya: I am the person that decides. I think getting to Sundance was a big marker for the film being finished. I'm so grateful that we made it in time for the premiere, because it pushed us to do so much work in a short amount of time. But I think the film wasn't really finished at Sundance. We were supposed to play at SXSW, which I wish we could have done, but I don't think the film was finished until June, when we finished it [for] the New York Human Rights Watch Film Festival. This was the director's cut, and I did feel the difference in how that cut of the film was received.

Filmmaker: This film is a brisk hour twenty, and this is the kind of film whose goal is to get in front of as many people as it can.

Kantayya: I thought a lot about how to make the film palatable. It has a lot of dense subject matter and it was such a rigorous edit in so many ways. But it was really important that the film was palatable, and we made some really hard choices. There are so many gems on the cutting room floor, and I was one of the editors. I was committed to making a film that you want more of.

Filmmaker: How much or little of an expert do you have to become to facilitate this best to a mainstream audience?

Kantayya: I still have this humility speaking about it. I've now spoken to Stanford's Human-Centered A.I. Institute. I've spoken to some really astute engineers and it's always very humbling to me. [laughs] The cast in Coded Bias are some of the smartest people I've ever interviewed; I think there are 7 PhDs in the film. They have advanced degrees in mathematics and science. But I hope the film levels the playing field. When I was at Sundance, someone at Google said, "We've been having this conversation internally and your film made it a conversation we can have for everyone." I hope the film makes people feel that they don't need a degree from Stanford to understand the technologies that will impact civil rights and democracy, our lives and opportunities in real ways.

Filmmaker: Because so much of the language in these interviews can be abstruse, how much do your editors also have to become experts to even begin to know what they're working with and how?

Kantayya: I edited a lot of the scene work and big structural work with the interviews. They were so rigorous and dense, so I did a lot of that work myself. My editors, Zachary Ludescher and Alex Gilwit, effortlessly work between editing and special effects. They have this incredible ability to work between mediums. There were some scenes we weren't sure would work until the special effects were roughed in. So, I was really lucky to have two editors that were really astute at special effects as well.

Filmmaker: You highlight hero Joy, the primary subject of the film and part of the Algorithmic Justice League, by sometimes shooting her in slow motion.

Kantayya: I feel like one of the most beautiful things about documentaries is that they make heroes out of real people. I was happy to shoot that in a very stylized way. In a documentary where there's so much beyond your control, I'm always grateful when there's a chance for me to control some of the elements.

Filmmaker: Did getting funding for a doc ever feel daunting and undoable to you?

Kantayya: Documentary has had a ladder, I think. Coded Bias is 100% funded by foundations. In the beginning, I went through the front door with applications. I'm not a filmmaker that has enough rich friends and access to capital, so I did it through the foundation route. And I did build a career like that, through small grants, always trying to overdeliver until I got the reputation to do bigger grants. I don't think it's the easiest path, but it is a path that's open to everyone. Limitations define your creativity, and you have to work with what you have. But I'm happy when they compare this film to The Social Dilemma, because it was certainly made at a different scale.

Filmmaker: You were one of the last films to get a proper theatrical festival premiere when you premiered at Sundance.

Kantayya: Getting to premiere at Sundance is amazing under any circumstances. We didn't know it was going to be the last [in-person] film festival for years. [laughs] But I'm also grateful because it informed how we re-edited the film. Getting to watch the film with an audience, and feel them with the film or feel for moments when I may have lost them, really informed how we re-edited Coded Bias. Like every filmmaker, I miss the movie theater, and we're just trying to reinvent ourselves in this new environment.

Filmmaker: Early in the film you show a montage of science fiction films to show how the tech industry aims to manifest the tools and futures those films envisage. I realized all of the films you show were directed by white men, so the bulk of the visions of the future we try to manifest is a future predominantly envisaged by white men.

Kantayya: What I learned in the making of Coded Bias is that there's always been this conversation between science fiction writers and technology developers. Marvin Minsky at MIT labs was in conversation with Arthur C. Clarke and was the one who actually made HAL in 2001. What I feel is that both technology developers and science fiction artists have been limited by the white male gaze. It's something we talk about in cinema with Hollywood So White and other movements. I think that can restrict imagination. Joy and I were joking that a lot of these male science fiction movies are about men having a romance with their A.I. women, including some of my favorite films like Blade Runner. [laughs] We also geeked out about what science fiction by women would look like. But I hope Coded Bias unleashes the genius of the other half of the population and stretches our imaginations. I think by recentering the conversation on women and people of color, who happen to be the ones leading the fight on bias in A.I., it shifts our imagination about what these technologies can be.

Filmmaker: Can you talk about building the arc of the A.I. narration that begins the film clean and objective and becomes distorted, more biased, and eventually racist and misogynistic over time?

Kantayya: I was constantly thinking about how to keep a cohesive narrative structure when there are so many storylines and geographies. Through research I discovered Tay, a real chatbot that became an anti-Semitic, racist, sexist nightmare. [Tay was a chatbot designed by Microsoft and released on Twitter; it learned from Twitter users to post inflammatory racist and misogynistic tweets and was shut down within 16 hours of its launch.] Half of the film uses the voice of Tay and actual transcripts from the Tay bot. Then, about halfway through the film, the voice of the chatbot morphs. Tay dies and it becomes this other voice, which is a woman's voice that eerily sounds a bit like Siri. That's written narration. The A.I. as a narrator was a device inspired by 2001 that comments on what the HAL of today is. [laughs]

We have to tell people that the A.I. narration is a reference to HAL from 2001: A Space Odyssey, because it became known to me that a lot of young people have not seen 2001. [laughs] Oh my god! I only learned this through showing my film to high school kids. I did a hands-up to see how many had seen it.

Filmmaker: And there were zero hands up?

Kantayya: Yeah. [laughs] So I was like, basically you didn't get the reference.

Filmmaker: Another idea in the film is that this tech is just a reflection of us. It is not this separate and magical thing, as it's been imagined in pop culture; it has inherited all of its programmers' weaknesses and biases. The difference is that those biases are automated, and that there's no human element where the algorithm questions whether it's wrong.

Kantayya: Human bias can be coded and we all have it. We often don't realize it. Steve Wozniak's wife got a different credit score than him on the Apple Credit Card, and he was like, "How can this be? We have all the same money, all the same assets, everything." It could be because women have a shorter history of credit in the US, or a shorter history of having mortgages. But the computer was somehow picking up on historical inequalities, and the programmer didn't know that, so it's an example of unconscious bias. A similar thing happened with Amazon. They installed a sorting system for resumes and were like, "This is great, this is going to undo the human bias that we all have," and lo and behold, the A.I. system picked up on who got hired, who got promoted, and who had job retention in the past, and it discriminated against any candidate that was a woman. It had the exact opposite impact. It just goes to show that even when the programmers have the best of intentions, the A.I. can pick up on unconscious biases and historic inequalities.
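
To make that mechanism concrete, here is a toy sketch, not Amazon's or Apple's actual system: the data, column names, and coefficients below are invented purely for illustration. A model trained on historical decisions that were skewed against one group will learn and reproduce that skew, even when the programmers have the best of intentions.

```python
# Toy illustration only (synthetic data; not any company's real system).
# A model trained on historically skewed hiring outcomes learns that skew.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000

skill = rng.normal(size=n)            # genuinely job-relevant signal
gender = rng.integers(0, 2, size=n)   # 0 or 1; the protected attribute

# Historical outcome: hiring tracked skill, but one group was hired less
# often at the same skill level -- the "historical inequality" in the data.
past_hired = rng.random(n) < 1 / (1 + np.exp(-(1.5 * skill - 1.0 * gender)))

model = LogisticRegression().fit(np.column_stack([skill, gender]), past_hired)
print(model.coef_)  # the learned weight on `gender` comes out negative: the
                    # model has absorbed, and will now automate, the old bias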

Filmmaker: Finally, I just want to confirm, the working title for this film was Racist Robots?

Kantayya: [laughs] I tested it. I loved that title so much! [laughs] But people wouldnt let me keep it.

See original here:
Male Science Fiction Movies are About Men Having a Romance with Their AI Women: Shalini Kantayya on Coded Bias - Filmmaker Magazine

The great reset meets the Internet of Bodies: manipulating human behavior with authoritarian surveillance – The Sociable

While the networking of humans and machines shows incredible promise for improving overall health and well-being for generations to come, the Internet of Bodies (IoB) also runs the risk of enabling a global surveillance state the likes of which the world has never seen.

Following the launch of its great reset agenda, the World Economic Forum (WEF) made a push for the global adoption of the IoB, which risks enabling an authoritarian surveillance apparatus that can manipulate human behavior to achieve its desired outcomes.

According to a recent RAND Corporation report, the IoB "might trigger breakthroughs in medical knowledge [...] Or it might enable a surveillance state of unprecedented intrusion and consequence."

The IoB ecosystem is part of the Fourth Industrial Revolution that the WEF wishes to harness for its great reset agenda.

"One silver lining of the pandemic is that it has shown how quickly we can make radical changes to our lifestyles [...] Populations have overwhelmingly shown a willingness to make sacrifices," said Klaus Schwab, WEF founder and director.

Conceived over five years ago and launched in June 2020, the so-called great reset agenda promises to give us a better world of more sustainability and equity if we agree to revamp all aspects of our societies and economies, from education to social contracts and working conditions.

Such radical changes would require a complete shift in our thinking and behavior, and what better way to modify our behavior than to monitor every move we make through a connected network of digital tracking devices?

According to RAND, "Greater connectivity and the widespread packaging of IoB in smartphones and appliances, some of which might collect data unbeknownst to the user, will increase digital tracking of users across a range of behaviors."

The WEF is fully behind widespread adoption of the IoB despite recognizing the enormous ethical concerns that come with having "an unprecedented number of sensors attached to, implanted within, or ingested into human bodies to monitor, analyze, and even modify human bodies and behavior."

Knowing that the Internet of Bodies can be used to control human behavior while gaining access to the most sensitive health, financial, and behavioral data of every person on the planet, the Davos elite urges "stakeholders from across sectors, industries and geographies" to work together to mitigate the risks in order to fully unleash the potential of the IoB, according to a WEF report from July 2020.

"After the Internet of Things, which transformed the way we live, travel and work by connecting everyday objects to the Internet, it's now time for the Internet of Bodies," wrote Xiao Liu, Fellow at the WEF's Center for the Fourth Industrial Revolution.

"This means collecting our physical data via devices that can be implanted, swallowed or simply worn, generating huge amounts of health-related information."

But while having access to huge torrents of live-streaming biometric data might trigger breakthroughs in medical knowledge or behavioral understanding, the RAND Corporation warns that the IoB could also enable "a surveillance state of unprecedented intrusion and consequence."

According to RAND, "Increased IoB adoption might also increase global geopolitical risks, because surveillance states can use IoB data to enforce authoritarian regimes."

For example, this is the same ecosystem that is allowing the Chinese Communist Party (CCP) to collect DNA data from its Uyghur population, so the authoritarian regime can further spy on, imprison, and sterilize an entire ethnic minority, among other horrible atrocities.

But if you want to see how the IoB fits into a great reset like the one the WEF is touting, look no further than China's social credit system, which uses enormous amounts of aggregated data on individuals, including health records, to determine their trustworthiness and to incentivize desired behaviors, according to RAND.

A population that knows it is being watched will change its behavior to conform to the norms, and its citizens will police themselves.

Thus, the IoB is a tool that can serve multiple purposes: it can revolutionize healthcare for the benefit of all; it can be used to monitor, track, and prevent global crises before they manifest; and it can be turned into an apparatus for manipulating human behavior in order to achieve the desired outcomes of the global elite.

Today, the WEF is fully behind the use of the IoB, and actively supports digital health passports (CovidPass) and contact tracing apps (CommonPass).

If you think that the idea behind contact tracing apps is just to track people infected by viruses, think again.

The same technology was used by the CCP to develop an app that literally alerts citizens with a warning when they come within 500 meters of someone who is in debt, according to the WEF Global Risks Report 2019.

The app has created what's essentially a map of deadbeat debtors, according to Chinese state media, and shows you the debtor's exact location, though it's unclear if the displayed information includes a name or photo.

So, while the WEF urges greater IoB use and contact tracing, the technology is not just for tracking the spread of a virus.

Contact tracing is also a tool for complete social control that keeps tabs on a nation's so-called deplorable or undesirable citizens.

Think social justice policing via contact tracing, not just through mobile phones but through tracking chips implanted in the human body.

The RAND report also warned that widespread IoB use "might increase the risk of physical harm, espionage, and exploitation of data by adversaries."

Indeed, if state-sponsored hackers or criminal organizations were to gain access to a medical device used by a high-profile target, the hackers could simply switch it off and assassinate their target.

As Richard Staynings, Chief Security Strategist at Cylera, once told The Sociable, "You no longer need to be MI6 and issued a Walther PPK in order to assassinate someone; you just need to gain access to the medical devices that are keeping that individual alive."

On top of the geopolitical risks, the RAND report warned that the IoB could also increase health outcome disparities, where only people with financial means have access to any of these benefits.

However, this seems an unlikely scenario, because the WEF doesn't like to see one nation gain too much power. It prefers balance. It wants every country to follow the rules. It wants a technocratic Utopia.

As such, the WEF would like to see the IoB regulated uniformly across the globe, and the Davos elite routinely call for its ethical governance, but that doesn't mean the surveillance would go away.

Not at all.

It just means that everybody would be spied on equally, after having consented to draconian measures dressed up as serving the greater good.

At its heart, the IoB is dependent upon collecting tons of biometric data, which will allow new forms of social control, according to the WEF Global Risks 2019 report.

The WEF concluded two years ago that "authoritarianism is easier in a world of total visibility and traceability, while democracy may turn out to be more difficult."

Now, the WEF wants to exploit the Fourth Industrial Revolution under the great reset agenda, and it has massive support from the media, world leaders, and captains of industry alike.

Klaus Schwab, founder and director of the WEF, had already called for the great reset back in 2014, but decided in June 2020 that this was the year to enact the scheme because the coronavirus crisis had presented a "rare but narrow window of opportunity."

And in order to make the Davos elite's globalist Utopia a reality, universal trust in the increasingly invasive uses of emerging technologies will be required.

If you are willing to believe that a global, un-elected body of bureaucrats based in Switzerland has your best interest at heart, then you are willing to accept that your corporeal autonomy, physical privacy, and mental freedom may be compromised to serve the greater good.

A skeptical look at the great reset: a technocratic agenda that waited years for a global crisis to exploit

Hackable humans can become godlike or fall to digital dictators lording over data colonies: WEF insights

Authoritarianism is easier in a world of total visibility: WEF report

Tech arms race will give corporations, governments the ability to hack human beings: Yuval Harari at WEF

Medicine or poison? On the ethics of AI implants in humans

Originally posted here:
The great reset meets the Internet of Bodies: manipulating human behavior with authoritarian surveillance - The Sociable

Human behavior, cross-cultural belief systems, and the color of yellow – Connect Savannah.com

Laney Contemporary and Atlanta-based artist Jiha Moon invite you to a pop-cultural artist introspective, Lucid Yellow, with more than thirty pieces of exciting and colorful gestural paintings, prints, ceramics, and mask installations on display from Nov. 13 to Jan. 23, 2021. A socially distanced artist opening reception will be held on the lawn Nov. 13 from 4-9 p.m.

"My work is always influenced by my life. I feel like my life and my art cannot be separated. In this exhibition, you will see a lot of those emotions coming through. With my techniques and color choices, I try to communicate with the viewers; often, you can see metaphors in the work. I'm hoping to provide an opportunity to experience that," says Moon.

As a Korean-American, Moon uses Lucid Yellow to explore bold cultural stereotypes of the color yellow in America, and what it means to be an American.

"There are underlying themes that I am always interested in. How do we define Americans? So, when people talk about Americans, am I included or not? And that has been an ongoing conversation for a long time; my entire art career is (kind of) based on that since I came to the US. I have my family here. My life and art are here, and it's inseparable to me," adds Moon.

Human behavior and cross-cultural belief systems such as religion, talismans, shamanic rituals, and cultural symbols are alluded to throughout her work. The newest addition to her collection of iconic imagery, alongside signature peaches and Twitter birds, is the evil eye.

"I want people to make eye contact with my work, and the work is looking back at you. This is really the evil eye concept. You know, the evil eye is so evil you could be cursed, but you could also protect yourself with evil eyes. I like that the idea has both meanings of protection and dangerous curses, and mythology," she explains.

Last December, Laney Contemporary presented Moon in a solo booth at NADA/Miami 2019. Other notable shows this year include State of the Art 2020 at Crystal Bridges Museum in Bentonville, Arkansas, and a group show (featuring important southern artists) at the Halsey Institute in Charleston, South Carolina. Jiha Moon is currently represented by Laney Contemporary. "I've been a fan of Jiha's work for a while, so we're really excited to be representing Jiha now," says director Susan Laney.

Go here to see the original:
Human behavior, cross-cultural belief systems, and the color of yellow - Connect Savannah.com

How to Leverage the Cloud to Create a Modern Culture of Data – SPONSOR CONTENT FROM SLALOM – Harvard Business Review

Renowned psychologist Albert Bandura once wrote, "Learning would be exceedingly laborious, not to mention hazardous, if people had to rely solely on the effects of their own actions to inform them what to do."

Imagine a world like that, where people's only data source is firsthand experience and the only way they can learn is the hard way. It is a world in which knowledge is not shared, and culture, comprising the collective beliefs, values, goals, and practices that guide people, effectively does not exist.

The real world is mercifully different. Fortunately, Bandura went on to explain, "most human behavior is learned observationally through modeling: from observing others one forms an idea of how new behaviors are performed, and on later occasions this coded information serves as a guide for action."

Data-driven organizations transcend this definition of learning from "observing others"; their members learn not only by observing other humans but also by tracking and analyzing large quantities of data. To truly improve based on that data, members must have a common understanding of what the data means, why it matters, and what should be done with it, along with the culture that surrounds it. In this way, a data-driven organization cannot succeed without a data-driven culture.

At Slalom, we call this kind of culture a Modern Culture of Data. It contains five key elements; two of them, Guardianship and Access & Transparency, are discussed below.

The Promise of the Cloud: Guardianship and Access & Transparency

Cloud services help foster a Modern Culture of Data, especially with respect to the elements Guardianship and Access & Transparency. Aspects of both appear as major drivers of cloud migration in this IDC research paper. In the words of one IT architect at a manufacturing organization quoted in the paper, "We needed to get away from our on-premises environment for a variety of reasons, mostly for data security, and we were long overdue for this move because our servers were breaking regularly." Enough server breakdowns can erode trust in data, tools, and practices. These and other Guardianship-related challenges turn many companies to the cloud to add greater functionality and security to their IT systems.

In terms of how cloud services support Access & Transparency, it's largely a matter of scale. IDC found that the most-reported trigger event that leads to public cloud migration occurs when, to cite its survey, "data has grown beyond the capacity of our existing systems." Concerns about scalability and security with its on-premises infrastructure are what drove Avis Budget Group to build its next-generation transportation platform on Amazon Web Services (AWS). Expanding upon the AWS connected vehicle solution framework, Slalom helped Avis develop a machine-learning-powered optimization engine that addresses the overutilization and underutilization of cars in real time. Avis estimates that optimizing car mileage across its fleets will save the company tens of millions of dollars. The insights from the new engine will affect how work is done at every level of the organization, which is also a prime example of Ways of Working and embedding insights into everything you do.

According to Christopher Cerruto, the vice president of global architecture and analytics at Avis, "By building on AWS, we're able to begin realizing our vision of building a full transportation platform at a pace that's 10 times faster than what we had imagined."

Digital Transformation and Banduras Other Theory

Reflecting on the progress of the Avis platform, Cerruto says, "You feel, for the first time, like you are limitless." That's the feeling made possible by a Modern Culture of Data. It also evokes another concept from Bandura's body of work: guided mastery.

Guided mastery is the deliberate journey from limited to limitless. Bandura's most famous experiment with guided mastery eases people with lifelong snake phobias through a five-step process of overcoming their fears. In one step, participants watch someone else touch a snake (and live to tell the story). They then proceed to touch the snake themselves while wearing heavy leather gloves. Stanford professor David Kelley, who won the National Academy of Engineering's Gordon Prize in 2020, says in a TED Talk that Bandura once described to him how some participants even ended up marveling out loud at the beauty of the snakes as they held them in their laps.

Bandura attributes the success of the experiment to the increase in confidence that each step affords. Belief in one's ability to navigate prospective situations is what Bandura calls self-efficacy and what Kelley calls creative confidence. And it's not just about snakes. Kelley won the Gordon Prize for almost single-handedly transforming the way that engineers are educated.

Self-efficacy and the process of guided mastery form the basis upon which Slalom and AWS created a joint approach to digital transformation, one that helps propel customers forward into the futures they're envisioning as individuals, teams, and organizations. Digital transformation is both inextricably linked to a Modern Culture of Data and enabled, accelerated, and elevated by cloud technology, if you know how to harness it.

To learn more about AWS | Slalom Launch Centers, visit slalom.com/aws-slalom-launch-centers.

Read the original here:
How to Leverage the Cloud to Create a Modern Culture of Data - SPONSOR CONTENT FROM SLALOM - Harvard Business Review

Movies with Mary: Big brother really is watching – Alton Telegraph

Movies with Mary: Big brother really is watching

The Social Dilemma is a documentary about social media, airing on Netflix, that may scare the puddin' out of you if you can watch it all the way through. Either that, or you will be bored and pick up your phone and turn to Facebook, Twitter or Instagram.

Twenty-three executives, engineers, and designers from Facebook, Instagram, Google, YouTube, Firefox, Twitter, Snapchat, etc. talk about social media: how it was designed, why it was designed, and what it has become.

At first, most of the social media websites were created to give people a way to connect with family and friends and share information, but as time went on, social media became more of a market for trading on human futures. The companies set up algorithms to predict human behavior and to manipulate it, according to this documentary.

Everything you do is being tracked and recorded to build models that predict your behavior. The models also manipulate us to change our behavior without us even knowing they are doing it. Social media is an addiction, just like alcohol, gambling and drugs.

People have become so addicted to social media that they aren't aware of how much time they spend online. There are only two industries that call their customers users: drugs and software. Social media is a drug.

Since the advent of social media, suicide and self-harm rates among young women have skyrocketed and bullying has increased, according to this documentary.

In the last few years, our country has been divided more and more politically. Social media has played an important part in this because the information we receive reinforces our beliefs. We are being manipulated. We receive only the news we want to see and read, rather than the truth, regardless of which party we support. If you are a Republican, you receive only news that supports that point of view, and if you are a Democrat, you receive only news that supports that point of view.

At this point, social media is not regulated.

Directed by Jeff Orlowski, The Social Dilemma was written by Orlowski, Vicki Curtis and Davis Coombe.

It starts off very, very slow. Psychiatrists, former executives, computer designers each talk about what social media is doing to manipulate your behavior and why. The why is trillions of dollars annually. If you stay with it, it will scare the heck out of you. It seems that George Orwell was just about 40 years off.

Big Brother is watching!

Movie critic Mary Cox lives in Wood River and studied film at the University of California, Los Angeles. She has worked in L.A. with various directors and industry professionals. Contact Mary at mary.cox@edwpub.net.

Go here to read the rest:
Movies with Mary: Big brother really is watching - Alton Telegraph

These are the world’s 10 most influential values – World Economic Forum

[Image: What do we care about most? Source: Visual Capitalist]

Our basic values can inform ideals, interests, political preferences, environmental views, and even career choices.

With sweeping data covering half a million surveys in 152 languages, Valuegraphics identifies 56 values that influence human behavior. It uncovers what people care most about around the world, through a contextualized dataset.

The 10 most important values

Individual motivations and values are universally organized. That said, research shows that the hierarchy of these values varies significantly.

According to Valuegraphics, here are the top 10 values we share across cultures.

[Image: The top ten values. Source: Visual Capitalist]

While it may not be surprising that family emerges as the most important value globally, it's interesting to note that a number of other connectedness values, such as relationships and belonging, emerged in the top 10. The values of loyalty and religion/spirituality ranked #6 and #7, respectively.

At the same time, security-related values, including financial and employment security, score highly around the world.

From a business and leadership context, values are interesting in that they can guide how people and consumers make their decisions. As people interact with the world, different experiences can engage their most closely-held values.

If you can understand what your target audience cares about, what they spend their lives chasing, now you have an actual chance to use data to understand how to engage and influence and motivate them.

The full list of the 56 most influential values

Covering 401 metrics and 370 questions, how did all 56 values break down within the extensive Valuegraphics database on a global level?

[Image: What we care about most, 1-26. Source: Visual Capitalist]

[Image: What we care about most, 27-56. Source: Visual Capitalist]

Across nine regions, the value of social standing stood at #17, while environmentalism came in at #36. Interestingly, the values of wealth (#38) and money (#52) both ranked lower on the spectrum.

Meanwhile, the values of respect (#15) and compassion (#16) fell closer to the top.

While many similarities exist across cultures, a number of fascinating differences emerge.

Take morality, for example. Across all regions, it illustrated some of the widest variance: it was the second-most important value in the Middle East, whereas it came in near the bottom in Central and South America. Another notable outlier is the value of patience. The African region placed it within its top five; by contrast, it ranked about midway (#26) through the global list.

Another fascinating discovery is how North America and the Middle East ranked the value of authority: both ranked it equally (#17), significantly higher than the global average of #30. Meanwhile, the value of tradition saw its highest ranking in Central & South America, but its lowest in Europe.

As the world becomes increasingly complex, understanding how values impact our attitudes and behaviors can help us deepen our understanding across several avenues of life. Consumer research, marketing, leadership, psychology, and many other disciplines all fall within the broad spectrum of the influence of what humans value.

See the article here:
These are the world's 10 most influential values - World Economic Forum

Wisconsin On Track To Double Its Total COVID-19 Deaths By Year’s End – Wisconsin Public Radio News

National and state experts say it's very possible that Wisconsin will double its total number of COVID-19 deaths before the end of the year, based on predictive modeling.

In a statewide address Tuesday night, Gov. Tony Evers cited an estimate from the Institute for Health Metrics and Evaluation (IHME) at the University of Washington that Wisconsin could see 5,059 COVID-19 deaths by Jan. 1.

As of Wednesday, the state had reported 2,457 deaths from COVID-19, an increase of 62 deaths from Tuesday.
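
As a rough back-of-the-envelope check (simple arithmetic on the figures above, not the IHME model itself; the article does not give the exact date, so a mid-November Wednesday, Nov. 11, 2020, is assumed here), the projection works out to roughly the state's recent daily pace sustained through the end of the year:

```python
from datetime import date

# Figures reported in this article
deaths_so_far = 2_457        # cumulative COVID-19 deaths as of Wednesday
latest_daily_increase = 62   # new deaths reported since Tuesday
ihme_projection = 5_059      # IHME estimate for Jan. 1

# Assumption: the "Wednesday" in question is Nov. 11, 2020
days_left = (date(2021, 1, 1) - date(2020, 11, 11)).days   # 51 days

additional_deaths = ihme_projection - deaths_so_far        # 2,602 more deaths
print(f"Required average: {additional_deaths / days_left:.0f} deaths/day")  # ~51
print(f"Most recent single-day increase: {latest_daily_increase} deaths")
```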

Ali Mokdad, a professor at IHME, said many states are facing similar projections. Mokdad said he's confident in the estimate because countries like Argentina and Chile saw similar growth in COVID-19 cases and deaths during their winter months in July, August and September.

"Right now when you look and see in Europe, they are like three or four weeks ahead of us," Mokdad said. "They started peaking, mortality is increasing and we are following, unfortunately, the same pattern everywhere with COVID, because COVID is a seasonal virus."

He said IHME's model looks at the relationship between various factors measured over the previous eight days, like mortality, testing, mask wearing and population density.

Oguz Alagoz, a University of Wisconsin-Madison engineering professor who specializes in modeling the spread of infectious diseases, said many factors, like changing human behavior or the availability of a vaccine, could impact the accuracy of a projection. But he agrees with IHME's estimate.

"I wish I could say, 'Oh, they are too pessimistic.' But I tend to agree with them that if we see the same patterns, same growth rate, 5,000 deaths is a possible outcome we could see by Jan. 1," Alagoz said.

Alagoz said IHME's estimate is similar to those based on other models from around the country, giving researchers more confidence that the projection is accurate.

For example, Harvard Medical School's COVID-19 Simulator projects Wisconsin will see 5,460 deaths by Jan. 1.

While estimates of COVID-19 cases and deaths varied widely at the start of the pandemic, Alagoz said most models have become more stable. That's because researchers better understand how the virus is spreading, and health care professionals better understand how to treat COVID-19 patients.

But Alagoz said human behavior in the next several weeks will have a big impact on the actual number of COVID-19 deaths seen in the state.

"It's really very much about individual buy-in because we know that here the transmission appears mostly in private events so like, I come together with a few friends or family members in an indoor space," Alagoz said.

Because IHME's model looks at data from the previous week, Mokdad said the projection hasn't accounted for how spread could increase if more people gather for the holidays.

He points out the current estimate is already a bad sign for the state's hospitals, which are projected to have a shortage of ICU beds and ventilators under IHME's model.

And Mokdad says even if a vaccine becomes available in early 2021, it will likely be too late.

"It's like the flu vaccine: if you take it in March or April, it's already kind of late. So we really need to put on our best behavior right now. Prevention is what we can do in order to avoid an increase in infection and mortality," Mokdad said.

Read the original post:
Wisconsin On Track To Double Its Total COVID-19 Deaths By Year's End - Wisconsin Public Radio News