Blog

Has Facebook manipulated our emotions?

01/07/2014 09:48

Facebook is testing things on us.

A new study in the Proceedings of the National Academy of Sciences (PNAS), an American scientific journal, revealed that Facebook intentionally manipulated the news feeds of almost 700,000 users to study "emotional contagion through social networks."
The researchers, affiliated with Facebook, Cornell University, and the University of California, San Francisco, tested whether reducing the number of "positive" updates in people's feeds reduced their own production of positive content. The same went for negative ones: could hiding posts containing sad or angry words lead users to write less pessimistic updates?
To do this, the researchers modified the algorithm Facebook uses to select posts so that posts containing positive or negative words were identified and classified. Some people were shown feeds skewed from neutral toward positive, others from neutral toward negative. The subsequent posts of these people were then studied to measure the behavioral effect.
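To make the classify-and-suppress step concrete, here is a minimal hypothetical sketch. The word lists, function names, and omission logic below are invented for illustration only; the actual study classified posts using the LIWC word-count software and deprioritized matching posts probabilistically, per user, within the News Feed ranking system.

```python
import random

# Stand-in word lists; the real study used the much larger LIWC lexicon.
POSITIVE = {"happy", "great", "love", "wonderful"}
NEGATIVE = {"sad", "angry", "awful", "terrible"}

def classify(post):
    """Label a post by whether it contains any positive or negative word."""
    words = set(post.lower().split())
    if words & POSITIVE:
        return "positive"
    if words & NEGATIVE:
        return "negative"
    return "neutral"

def filtered_feed(posts, suppress, omit_prob, rng=random):
    """Drop each post of the suppressed class with probability omit_prob."""
    return [p for p in posts
            if classify(p) != suppress or rng.random() >= omit_prob]

feed = ["I love this wonderful day",
        "meeting moved to 3pm",
        "feeling sad and angry today"]
print([classify(p) for p in feed])  # → ['positive', 'neutral', 'negative']
```

With `omit_prob` set near 1.0 the suppressed class vanishes from the feed; at small values most posts still get through, which is closer to the gentle "deprioritization" the researchers describe.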
The result of the experiment? It is a "yes" to the question above: social networks can spread positive and negative feelings. Another result: Facebook intentionally made thousands and thousands of people sadder.
Facebook's methodology raises a number of ethical issues. The group of researchers may have acted beyond the standards of research ethics, beyond the limits established by federal law and human rights treaties. James Grimmelmann, a professor of technology law at the University of Maryland, said that "if you expose people to something that changes their psychological status, you are conducting an experiment: this is the kind of thing that requires informed consent."
Ah, informed consent. Here is the only mention of "informed consent" in the study: the research "was consistent with Facebook's Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research."
That is not at all what most social scientists would define as "informed consent."
This is the section of Facebook's Data Use Policy in question: "For example, in addition to helping people see and find things that you do and share, we may use the information we receive about you for internal operations, including troubleshooting, data analysis, testing, research and service improvement."
Thus, there is a vague mention of possible "research" in the short list of things to which you give your consent when you sign up for Facebook today. As bioethicist Arthur Caplan told me, however, it is worth asking whether this lawyers' loophole is really enough to warn people that "their Facebook account can be used by every social scientist on the planet."
All scientific research that receives federal money must follow the "Common Rule" on human subjects, which defines informed consent as something that includes, among other things, "a description of any reasonably foreseeable risks or discomforts to the subject." As Grimmelmann observed, nothing in the document suggests that Facebook, the company, reserves the right to sadden you by removing everything positive and cheerful from your news feed. Manipulating feelings is a serious matter, and the constraints on approving a trial of this type are quite demanding. (Princeton psychologist Susan Fiske, who edited the paper for PNAS, told The Atlantic that the experiment was approved by the oversight committees of the researchers' institutions. But she also admitted to having some qualms about the research.)
Facebook probably did not receive any federal funding for this research, which therefore may not fall under the Common Rule. Leaving aside the fact that following these rules is common practice even for private research institutes such as Gallup and Pew, the question then becomes: did Cornell or the University of California, San Francisco fund these studies? As public institutions, both must obey the law. If they did not fund their researchers' participation, it is not clear what standards the research should be held to, says Caplan. (I have contacted the study's authors, their universities, and Facebook, and will update this piece as soon as possible.)
Even if the research in question was legal, it does not seem to meet the standards required of those who hope to publish in PNAS. One of the requirements, according to the journal's website, is that authors include in the methods section a brief statement identifying the institutional committee that approved the experiments (the study in question contains no such statement). Another requirement states that all experiments must have been conducted according to the principles expressed in the Declaration of Helsinki. This requires that human subjects "be adequately informed of the aims, methods, sources of funding, any possible conflicts of interest, institutional affiliations of the researcher," as well as the anticipated benefits, the potential risks of the study, and the discomfort it may entail.
During the study, it seems the social network made us happier or sadder than we would otherwise have been. Now it has made all of us more inclined not to trust it.
 

A post on Facebook says:

OK so. A lot of people have asked me about my and Jamie and Jeff's recent study published in PNAS, and I wanted to give a brief public explanation. The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook. We didn't clearly state our motivations in the paper.

Regarding methodology, our research sought to investigate the above claim by very minimally deprioritizing a small percentage of content in News Feed (based on whether there was an emotional word in the post) for a group of people (about 0.04% of users, or 1 in 2500) for a short period (one week, in early 2012). Nobody's posts were "hidden," they just didn't show up on some loads of Feed. Those posts were always visible on friends' timelines, and could have shown up on subsequent News Feed loads. And we found the exact opposite to what was then the conventional wisdom: Seeing a certain kind of emotion (positive) encourages it rather than suppresses it.

And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it -- the result was that people produced an average of one fewer emotional word, per thousand words, over the following week.

The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.

While we’ve always considered what research we do carefully, we (not just me, several other researchers at Facebook) have been working on improving our internal review practices. The experiment in question was run in early 2012, and we have come a long way since then. Those review practices will also incorporate what we’ve learned from the reaction to this paper.
— Adam D. I. Kramer, Facebook post, 29/06/2014, from Floyd, VA, United States

The map of where lightning strikes, in real time!

01/07/2014 09:26

Virtually every moment a bolt of lightning strikes from the sky to the ground, or discharges within the atmosphere, jumping from one cloud to another. Keeping track of every discharge is impossible, but for some time their behavior has been studied with satellites and other detection systems, to understand where they are concentrated, in which periods of the year, and whether they can cause problems for communication systems or other infrastructure. Systems to keep lightning under surveillance are expensive, but for some time an ambitious project called "Blitzortung.org" has hoped to solve the problem by creating a network of low-cost detectors distributed across the territory, each priced at no more than 200 euros.

Although still incomplete and ever expanding, the "Blitzortung.org" network is already active and lets you see, in near real time, where lightning strikes fall in different parts of the world on a series of online maps. Simply put, the various radio receivers scattered across the territory pick up the interference caused by distant lightning and send their readings via the Internet to a data center. A set of algorithms combines the readings of the different sensors and, based on the arrival times and the distances between them, reconstructs the point where the lightning fell. The indication on the map is quite accurate, especially in areas with a high number of sensors.
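The arrival-time reconstruction described above can be illustrated with a toy multilateration sketch. This is not Blitzortung's actual solver (which works on GPS-timestamped signals from real receivers over the curved Earth); the flat 2-D geometry, sensor coordinates, and brute-force grid search below are hypothetical, chosen only to show how differences in arrival times pin down a strike location even when the emission time is unknown.

```python
import itertools
import math

C = 299.792458  # signal speed (speed of light) in km per millisecond

def tdoa_residual(point, sensors, times):
    """Sum of squared time-difference mismatches for a candidate point."""
    dists = [math.dist(point, s) for s in sensors]
    err = 0.0
    for i, j in itertools.combinations(range(len(sensors)), 2):
        measured = times[i] - times[j]          # observed arrival-time gap
        predicted = (dists[i] - dists[j]) / C   # gap implied by candidate
        err += (measured - predicted) ** 2
    return err

def locate(sensors, times, lo=-100.0, hi=100.0, step=1.0):
    """Brute-force grid search for the best-fitting strike point (km)."""
    best, best_err = None, float("inf")
    x = lo
    while x <= hi:
        y = lo
        while y <= hi:
            e = tdoa_residual((x, y), sensors, times)
            if e < best_err:
                best, best_err = (x, y), e
            y += step
        x += step
    return best

# Four hypothetical receivers (km) around a simulated strike at (12, -7)
sensors = [(-80.0, -60.0), (90.0, -50.0), (-70.0, 75.0), (60.0, 85.0)]
strike = (12.0, -7.0)
t0 = 5.0  # unknown emission time (ms); it cancels out in the differences
times = [t0 + math.dist(strike, s) / C for s in sensors]

print(locate(sensors, times))  # → (12.0, -7.0)
```

Because only time differences enter the residual, the unknown emission time t0 drops out, which is exactly why a network of cheap receivers with synchronized clocks is enough: no sensor needs to know when the bolt actually occurred.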

 

The current situation in Libya

25/06/2014 08:56

Three soldiers were killed and seven others injured today in Benghazi, on the day Libya elected its new parliament, announced Gouider Khalil, a spokeswoman for a Benghazi hospital. The clashes, a security-services source added, occurred south of the city between the army and an Islamist group. Meanwhile, polling stations closed at 20:00; results will be announced on Friday and Saturday. Of a total of 1.5 million eligible voters, by 17:30 some 400,000 had voted, for a turnout of 27%.

For over a month, Islamist groups have been clashing with forces loyal to former General Khalifa Haftar, who launched an offensive to clear Benghazi of extremist groups. On election day Haftar agreed to a cease-fire to allow citizens to go to the polls and vote for the new Parliament, an appointment kept by only 45% of registered voters. Also in Benghazi, the human rights lawyer Salwa Bugaighis was murdered, the victim of an ambush in front of her home. The woman, in the front line defending political prisoners under the Gaddafi regime, received in 2012 a "Vital Voices Global Leadership Award" from a foundation very close to Hillary Clinton. The killing was called "a cowardly, despicable and disgraceful act against a brave woman and a Libyan patriot" by the U.S. ambassador to Libya, Deborah Jones. Bugaighis was one of the organizers of the demonstrations in Benghazi on February 17, 2011, considered the birth of the revolution against Gaddafi. The lawyer later joined the National Transitional Council (NTC), the rebel government. After three months, however, came her sensational resignation: "They know that women played a decisive role in the revolution, but now they think power should go to men," she said polemically, criticizing the absence of women in the institutions of the new Libya. (ANSAmed)
