Category archive: science

How to study the social sciences, part VI

Time for another piece from my in-the-making dissertation, this time on the blackboxing of statistical measurements. It took more time than I expected, mostly because I had to struggle with statistics :D. Comments from statisticians are much appreciated (in Swedish or English). Here we go!

The black boxes of quantification and statistics

So far we have looked very directly at how blackboxing takes place, how it communicates via interfaces, and how black boxes and interfaces are combined in epistemic assemblages. There is, however, another riddle that has to be solved, namely that of quantification and statistics. Once more I shall give an example from the Sociology of Scientific Knowledge, more precisely from one work by Donald MacKenzie. I will also consider Latour’s notion of centers of calculation, the history of statistics and its epistemic status, and then advance into some more recent research in the sociology of quantification.

Pearson and Yule – A statistical controversy

In his article Statistical Theory and Social Interests (1978), Donald MacKenzie analyzes a controversy, and an emerging breakthrough in statistical methods, that took place during the first one and a half decades of the 20th century between Karl Pearson and Udny Yule, both regarded today as pioneering statisticians.

The controversy between Pearson and Yule concerned how to measure association at the nominal scale level. Pearson had in 1905 suggested the tetrachoric coefficient as a solution to the problem of quantifying nominal scales, something which Yule criticized openly for several years (1). MacKenzie’s elaboration on this controversy is interpreted through an analysis of their respective differences in social interests:

/…/ Pearson’s commitment to eugenics played a vital part in motivating his work in statistical theory. Pearson’s eugenically-oriented research programme was one in which the theories of regression, correlation and association played an important part /…/ Regression was originally a means of summing up how the expected characteristics of an offspring depended on those of its parents; the bivariate normal distribution was first constructed by [Francis] Galton in an investigation of the joint distribution of parental and offspring characteristics. (MacKenzie 1978: 53)

MacKenzie’s point is that advances in statistics, even though regarded as esoteric and mathematically ‘disembodied’, are guided and influenced by a set of social and cognitive interests that orient the goals and directions of what to develop and what to disregard. Early 20th-century statistics in Britain was thus, at least partially, influenced by a need within eugenics and population control. In Britain at the time, eugenics and ‘national efficiency’ were regarded as legitimate political options, and were even discussed in government departments. Yule, on the contrary, had no sympathy for eugenics, and instead argued that heredity was a largely unimportant factor compared with environmental ones (MacKenzie 1978: 58-59).

What we have is thus a classical social explanation of how statistics develops in line with needs defined by group interests (such as those of the eugenics movement) and larger social interests (for example state governance). What MacKenzie pays less attention to is what happens next:

Contemporary statistical opinion takes a pluralistic view of the measurement of association, denying that any one coefficient has unique validity /…/ Yule’s Q remains a popular coefficient, especially amongst sociologists. Pearson’s tetrachoric coefficient, on the other hand, has almost disappeared from use except in psychometric work. (MacKenzie 1978: 65)

I am not in a position to evaluate whether this is valid for statistics in general. What I do find necessary, on the other hand, is to think through the dispersion, usage and effects of statistical methods in the terminology of blackboxing. Back in the early 20th century, many of the core statistical measurements used in the social sciences today were developed, for example the chi-square test, Pearson’s r, and the advances in correlation and regression by Galton (see Hacking 1990: 180-188).
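To make the object of the controversy concrete, here is a minimal sketch (my own illustration, not from MacKenzie) of how two different coefficients of association, computed on the very same 2x2 table, give different numbers. Yule’s Q is the coefficient mentioned in the quotation above; the phi coefficient stands in here for a Pearson-style measure, since the actual tetrachoric coefficient assumes an underlying bivariate normal distribution and is considerably harder to compute.

    import math

    # Hypothetical counts in a 2x2 table:
    #            Y = 1   Y = 0
    # X = 1        a       b
    # X = 0        c       d
    a, b, c, d = 30, 10, 20, 40

    yules_q = (a * d - b * c) / (a * d + b * c)
    phi = (a * d - b * c) / math.sqrt((a + b) * (c + d) * (a + c) * (b + d))

    print(f"Yule's Q = {yules_q:.3f}")  # ~0.714
    print(f"phi      = {phi:.3f}")      # ~0.408: same table, different coefficient

The point of the sketch is only that the choice of coefficient is not given by the data itself, which is what makes the ‘pluralistic view’ quoted above, and the social interests behind each coefficient, sociologically interesting.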

Just as in the case of the Michelson-Morley experiments, statistical methods that are now deprecated may very well come to reinforce or weaken decisions about which black boxes to open and which ones to leave closed. A statistical method may be blackboxed, taken out of its context of discovery, and applied widely. Or it may be broken, considered obsolete, or simply veiled in historical darkness for other reasons, perhaps only to re-emerge in the detailed archives of the historian of science.

An example of a ‘successful’ blackboxing is Pearson’s r. In a textbook on social scientific methods, written by Gothenburg researchers close to the SOM-institute and used in many social science classes locally, an interesting passage appears:

The calculation of Pearson’s r is complicated to say the least /…/ Even though it can be useful on some occasion to make the calculations yourself /…/ – not long ago researchers had to employ assistants to be able to do these calculations at all /…/ it is of course [today] the computers that calculate Pearson’s r for us.
(Esaiasson et al. 2002: 392)

The Pearson product-moment correlation coefficient (r) demands time-consuming work and considerable mathematical skill to calculate manually. The first moment of delegation thus meant involving more humans (assistants) to do this work. And today, finally, we have computers doing the same work in milliseconds. A statistician, or a social scientist for that matter, must of course master the usage and interpretation of the computer’s output, but in routine work he or she is able to forget the assistants and the hard work it once took to use this statistical tool.
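As a rough illustration of this chain of delegation (my own sketch with made-up data, not an example from the textbook), the definitional sums that assistants once computed by hand can be written out in a few lines, and then handed over once more to a library function inside the software package:

    import math
    import random

    random.seed(1)
    x = [random.gauss(0, 1) for _ in range(200)]      # a hypothetical survey variable
    y = [0.6 * xi + random.gauss(0, 1) for xi in x]   # a second, correlated variable

    def pearson_r(xs, ys):
        """Pearson's r from the definitional sums, the 'assistant's' way."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
        sx = math.sqrt(sum((a - mx) ** 2 for a in xs))
        sy = math.sqrt(sum((b - my) ** 2 for b in ys))
        return cov / (sx * sy)

    print(round(pearson_r(x, y), 3))
    # One level deeper into the black box, the same work is a single call:
    # from statistics import correlation; print(correlation(x, y))  # Python 3.10+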

Thus it is possible to conclude that statistical measurements developed in the context of the British eugenics movement can be dislodged from their context of discovery through blackboxing, and find their way into the software packages used today for statistical calculation, as standardized measurements and tests for evaluating the quality of survey data. Now, this de-contextualization not only means that it is possible to forget the tedious work that had to be done before computers. It also means that it would be absurd to accuse someone calculating Pearson’s r of being a follower of eugenics, just as it is absurd to accuse someone of militarism for using the internet just because the internet was originally constructed as a military computer network. For statistics, Latour’s actualist principle still applies: the fate of facts and machines is in later users’ hands, and their qualities are a consequence of collective action (Latour 1987: 29, see also the sections on Latour above).

But statistics are not only blackboxed as they are assembled into research practices. They also function as interfaces that are able to translate research results into comprehensible facts. The time is ripe to go further along this line, and to investigate how the modern state in particular has requested such scientific information.

Footnotes:

1. The controversy is much more elaborate than this. To save space, I refer the reader to MacKenzie’s article in its entirety.


How to study the social sciences, part V

Today was a very good day in my dissertation writing. I managed to scribble down 10k characters and decided to instantly put them here on the interwebs. This draft concerns what I will call ”withdrawn hardened functions” in epistemic objects. It is partly influenced by object-oriented ontology, with which I am trying to enrich the standard Science and Technology Studies terminology. It’s quite a heavy read, so enjoy or surf along!

So far I have only dealt with the positive domains of scientific knowledge: blackboxing, interfaces and assemblages as productive elements. But a core problem in the theory of science is what is unknown, what hides in the unconscious or hidden domains of imperceptibility, what is included and what is left behind. To deal with this complex issue I also need to adopt a terminology for talking about what evades a concrete epistemic assemblage.

To these questions, at first glance, pure actualism gives little room to navigate. I will propose that the black boxes I encounter have withdrawn hardened functions, which make them combinable and plastic. This feature has been called ”immutable mobiles” by Latour (1999: 306-307), and it falls close to what Star & Griesemer (1989) call a ”boundary object”. As mentioned before, blackboxing is in fact a process of forgetting, embedding and ‘hard coding’ the tasks and processes needed to produce, on a surface level, something that is positive knowledge. For example, a pre-compiled dataset of statistical information gathered from a survey makes computerized statistical calculation possible, and quite user friendly compared to doing it manually, precisely because it in a given moment gives us the opportunity to forget thousands of questionnaires and how they were collected and assembled, and instead pay attention to creating bar charts and diagrams for a scientific report.
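A minimal sketch of what this forgetting looks like in practice (toy data of my own, not the SOM material): once the responses exist as rows and columns, the analyst only touches the dataset’s interface, never the paper questionnaires behind it.

    from collections import Counter

    # Stand-in for a pre-compiled survey dataset; in real work this would be
    # read from a file delivered by the survey organization.
    rows = [
        {"trust_in_parliament": "high"},
        {"trust_in_parliament": "medium"},
        {"trust_in_parliament": "high"},
        {"trust_in_parliament": "low"},
        {"trust_in_parliament": "medium"},
    ]

    counts = Counter(r["trust_in_parliament"] for r in rows)
    for answer, n in counts.most_common():
        share = 100 * n / len(rows)
        print(f"{answer:<8} {'#' * int(share // 2):<20} {share:.0f}%")  # crude text bar chart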

This is a wholly different approach from the outcome of Kuhnian thinking. Whereas the process of forgetting is historically very dramatic in Kuhn, I shall argue that it is shallow compared to the object-oriented aspect of blackboxing. Let’s take a look at a central passage in The Structure of Scientific Revolutions:

In short, they [textbooks] have to be rewritten in the aftermath of each scientific revolution, and, once rewritten, they inevitably disguise not only the role but the very existence of the revolutions that produced them. Unless he has personally experienced a revolution in his own lifetime, the historical sense either of the working scientist or of the lay reader of textbook literature extends only to the outcome of the most recent revolutions in the field. (Kuhn 1996: 137)

This leads Kuhn to think of different historical paradigms as incommensurable, and to the view that the disguising of past revolutions produces an image of scientific progress as linear and cumulative. But while these two points are valid and refreshing for the history of science, they are indeed very clumsy for closer studies of scientific activities. A paradigm would then appear as a monstrously large black box, where a whole generation of scientists is only able to think ‘within the box’, while the actual workings of the machinery are veiled. Only when enough anomalies appear do scientists, according to Kuhn, start to suspect that the whole paradigm might be wrong.

Two problems arise here. The ‘monstrous’ aspect of Kuhnian historicity leads to a sort of empirical over-determination. In the reports of the SOM-institute we find, for example, a terminology resembling the sociology of Durkheim, Parsons, Merton etc. The methods of surveys and quantification are also ‘borrowed’ from the intensified usage of these methods in sociology towards the end of the 19th century. Even though this is true on one level, I argue that it adds very little to our understanding of what is done, and of what that practice means. The abstractness of paradigms, rather ironically, makes the co-production of scientific objects and other objects invisible. To take a crude example (which it would be unfair to attribute to Kuhn himself): if I read in the local newspaper ”The researchers talk about a Gothenburg effect and a slow norm shift” (as already quoted in the prelude section of this chapter), and then conclude that this is knowledge within a Durkheimian paradigm since it talks about norms and norm shifts, I would instantly remove myself from a process that has significant value for translating the research practice of the SOM-institute into a circulation of facts. The concept of norms is indeed built into the theoretical tools used (which in turn may be blackboxed), but if we ignore the fact that another actor, the local newspaper Göteborgs-Posten, made use of and placed a high value on the much debated question of corruption scandals, then the role of science and its interfacing with other societal assemblages is abruptly veiled in darkness, and the analysis stops at what I consider to be a shallow level.

Another, more serious flaw in Kuhnian-inspired theories of science is their human-centered character. For science to change, either the scientists need to change their beliefs, theories and everyday practices, or they have to be replaced by a new generation of scientists (1). This is not true of technology, and with technoscience it is not valid either. Let me give two examples, one simple and one advanced:

Example 1 – The hammer

A carpenter uses hammers (2) as a routine piece of equipment when building houses. The hammer is connected to other objects such as nails, human users, and wooden planks. Hammers are constructed objects, and in one respect they reconfigure the human user too, who has to learn how to use them. One could even say that hammers are paradigmatic technologies of house building, since they imply methods, can be calculated with by architects, etc. Now, the hammer may also be used for committing a brutal murder. Then it becomes a piece of evidence in a murder investigation, is placed in a plastic bag, checked for fingerprints, and may even be the technical evidence that puts the murderer in prison for several years. A skilled carpenter knows the difference between a good and a bad hammer, but in the moment of driving nails into wood his or her attention lies elsewhere than with the technological advances, means of production, and price of the hammer. It is precisely because it is blackboxed, because it may withdraw from full inspection and reflection, that it is a powerful tool. As the house is completed and populated with new people, they in turn do not need to know anything about hammers, even though hammers may be ‘implicated’ in the house and need to be brought forth once again when the house is repaired. The hammer is thus more than its use together with nails and planks, more than the carpenter’s skills, and more than evidence in a courtroom. The hammer survives the house.

Example 2 – Experiments in relativity

Even though I consider the Sociology of Scientific Knowledge unsuited to my theoretical needs, Harry Collins and Trevor Pinch (1993) have produced a textbook example of how scientific experiments may reinforce each other across historical paradigms. In their chapter Two Experiments that ‘Proved’ the Theory of Relativity, Collins & Pinch set out to understand how the 1919 solar eclipse experiment led by physicist Arthur Eddington was accepted very swiftly, even though the results of the actual experiments were quite poor and inconclusive due to the harsh conditions of photographing starlight as it was supposed to be displaced by the sun’s large gravitational field (and thus prove the theory of relativity). The experiment was very difficult to perform at the time: cameras had to be mounted on remote islands to be in time for the solar eclipse, and they were sensitive to temperature and vibrations due to the long exposures needed to make the photographs.

A contributing factor to the quick acceptance of the inconclusive results of the Eddington experiment was, according to Collins & Pinch, that beginning in 1881 Albert Michelson (later in collaboration with Edward Morley) had performed a series of experiments with a wholly different purpose. They wanted to measure the ‘aether drift’ that was thought to occur as the earth moved through space. It was believed that light traveled through the medium ‘aether’, and thus the movement of the earth would produce slightly different speeds of light in different directions. These experiments, which were repeated over half a century, failed to detect any significant variation, and thus many came to consider the speed of light constant instead.

Now, it may seem that the Eddington experiment and the Michelson-Morley experiments are disconnected. But Collins & Pinch connect them, despite their being about two different things:

The way the 1919 observations fit with the Michelson-Morley experiment should be clear. They were mutually reinforcing. Relativity gained ground by explaining the Michelson-Morley anomaly. Because relativity was strong, it seemed the natural template through which to interpret the 1919 observations. (Collins & Pinch 1993: 52)

As the Michelson-Morley experiments kept failing, they unintentionally reinforced Einsteinian relativity theory, because it presupposes a constant speed of light. The results of Michelson-Morley, even though they were a ‘failure’, could become a component part in strengthening the Eddington experiments, even though Eddington had a wholly different theoretical purpose. What I am getting at here is a somewhat dramatic comparison: just as the hammer can be used both for carpentry and for murder, scientific results, methods and machinery can be used for very different purposes, in different setups and epistemic practices. Even though carpentry and relativity physics are radically different activities, the point is that parts and components can be taken out of their contexts, since they are rendered mobile by way of blackboxing. Assemblages, architectural or scientific, mobilize and assemble their equipment, most of which is already there. But assembling and selecting what components to choose is not only about actively knowing where to go. It is equally important to forget. Be it the theoretical functioning of the hammer or of the ‘aether wind’, exclusion is as important as inclusion.

This, I will argue, is also the case for the social sciences, especially concerning their use of quantification, which will be the topic of the next section.

(1) Of course paradigms may extend over centuries, but it can still be said that Kuhn anchors the durability of scientific beliefs in scientists and communities of researchers.

(2) Selecting this example is a tribute to Heidegger’s tool analysis in §15 of Sein und Zeit (1972 [1927]), where the hammer is used as an example of how a piece of equipment always stands in relation to other objects, and of how equipment has to withdraw from consideration in order to be used for something.


Hacknight

You haven’t forgotten to put Hacknight 3, 6-7 August, in your calendars, have you? The invitation and call for papers are already out.

Last year I took part by presenting Telecomix’s various crypto projects, and Raccoon went into the details. Then I sat up all night with hacker friends reverse-engineering a Fonera router. (The day after I was so tired that I forgot which password we had set, but that doesn’t matter, it worked after all.)

The year before that, i.e. ”Summer of Datalove, E01”, I also attended the first iteration of Hacknight.

Forskningsavdelningen has had some trouble with people who think modems are dangerous. Moreover, ”Forsken” is mentioned in the book Svenska Hackare. So, if you want to see a bit of hacker history as it is being written, just come along!

So, I encourage anyone to come and take part. You don’t need to know more than 9000 technical words, and if you run Windows on your computer all is forgiven. Moreover, Forskningsavdelningen has a new space this year that I am very keen to see. Become a guest researcher for a night. Nights of hacking, anything can happen!

See you there!


How to study the social sciences, part III


Today seems to be a good day for empirical philosophy. I wrote this section after lunch, so it’s freshly converted from LaTeX to HTML, and thus tentative in character. This section is supposed to come after the part on blackboxing, which in turn comes after the prelude to the chapter. Or, for those interested, just read it :D. The philosophical problem is the Zeitlichkeit of black boxes, how they invent their own time. The empirical problem is to describe what the quantitative social sciences are really doing. Hence, empirical philosophy!

Blackboxing and time

The historicity of a black box is not necessarily such that going back in time means that it is more open. This is sometimes true for machines as they are invented, where you would usually travel back in time to find the origin of a technology in a research lab. Blackboxing is however ‘anti-genealogical’, because when black boxes break down, when they fail, they are just as easily reversed, perhaps to an even more primitive stage than when they were invented. Moreover, due to the implied actualism, blackboxing as a process never ends. Machines, concepts, methods and procedures have to be constantly used, maintained and upgraded to keep working, even though this is usually invisible to the ‘ordinary user’. Even though we rarely think about it, there are technicians employed around the clock to keep our mobile telephony working, scientists in labs doing everyday research in standard procedures to keep our facts straight (Landström 1998), and teachers in schools repeating the instructions of grammar to pupils five days a week.

On a more profound ontological level, this goes as well for actants, entelechies and hybrids (these concepts will be explained below):

1.2.8 Every entelechy makes a whole world for itself. It locates itself and all the others; it decides which forces it is composed of; it generates its own time; it designates those who will be its principle of reality. It translates all the other forces on its own behalf, and it seeks to make them accept the version of itself that it would like them to translate. (Latour 1988: 166)

It is thus imperative not to study blackboxing in conventional linear time frames. The SOM-institute is actually a good example of this. Going from the 1999 survey, as described above, to the 2010 survey and looking at the methodological chapter, the box is more open than it was roughly a decade earlier. The method documentation chapter from the 1999 survey is six pages long (Lithner 2000), while the one from the 2010 survey is 33 pages (Nilsson & Wernersdotter 2011). At least in writing, it seems to take more words to account for the same type of survey. Word quantity can however be deceiving; instead, one should look at the openness of the black box, at the amount of work needed to make it work:

During recent years it has become more difficult to reach a high response rate, and moreover the SOM-surveys have increased in size. This year’s figure [response rate] must hence be regarded as high. In the 1999 survey the response rate was 67 per cent, which is on the same level as the average of the fourteen nationwide SOM-surveys made so far /…/ (Lithner 2000: 398)

And for the 2010 survey:

From the 2000s the level [of response rates] has however gone down. While the average up until 1999 was 68 per cent, for the first decade of the 2000s it was 63 per cent. The 2008 survey became the first one to go below 60 per cent, which was also the case in 2009. This year’s survey did however reach 60 per cent again. (Nilsson & Wernersdotter 2011: 557-558)

Then another seven pages are spent analyzing who is not responding to the questionnaires, and why. What was unproblematic a decade earlier is debugged, analyzed and progressively made to work again. Black boxes are deceptive in this way: they only withdraw when they function as they are supposed to. Layer by layer (or rather, variable by variable) the missing respondents are localized. One such example is the group of young men:

The lower response rate among the young groups is especially clear among young men. For men in the ages 20-29 years the response rate is 36 per cent, compared to 48 per cent among women in the same age. (Nilsson & Wernersdotter 2011: 562)
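The kind of localization described here can be pictured with a small sketch (hypothetical toy records of my own, not the SOM gross sample): the survey’s black box is opened group by group, just as the methodology chapter proceeds variable by variable.

    from collections import defaultdict

    # Each hypothetical record: (gender, age group, questionnaire returned?)
    sample = [
        ("man", "20-29", False), ("man", "20-29", True), ("man", "20-29", False),
        ("woman", "20-29", True), ("woman", "20-29", False), ("woman", "20-29", True),
        ("man", "50-59", True), ("woman", "50-59", True),
    ]

    sent = defaultdict(int)
    answered = defaultdict(int)
    for gender, age, returned in sample:
        sent[(gender, age)] += 1
        answered[(gender, age)] += returned          # True counts as 1

    for group in sorted(sent):
        rate = 100 * answered[group] / sent[group]
        print(f"{group}: response rate {rate:.0f} per cent")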

The traditional way of thinking about scientific discoveries is that phenomena are discovered, and then, once discovered, they were there all along. The standard way of thinking about innovations is that once someone has invented a technology, a method or a formula, it is simply there for us to use (provided we can afford it or know how to use it). This, however, only works for studying ready-made science. When analyzing science in action, time is relative to the speed of black boxes. Intercontinental telephony works at the speed of fiber-optic cables, as long as they work. But when they break down, their speed is reduced to the time it takes for technicians to localize and mend the failed components. The quantitative social sciences, I shall argue, are no different in this respect. And as we saw in the prelude to this chapter, they are no less ‘cutting edge’ than the latest cellular networks.

Off topic, kind of: so what, then, is cutting edge technology? Is it the latest iPhone or the skyline of Dubai? Maybe so, but it is also your kitchen knife (pun intended), the shoes you wear, even though they might be really old, or the turbofan jet engines on your recent flight, even though they are 1960s tech. The Minicall pager, the fleece sweatshirt and the dialup modem are however not cutting edge. They came after jet engines; still, when nobody uses them they are no longer in networks, so they retire as obsolete artifacts in museums and deep in your closet.


Epistemes and networks

I wrote this little comparison of Latour and Foucault on a plane. Since I’m mildly scared of the flying monsters we call aeroplanes, I scribble things down very swiftly in my hipster Moleskine notebook while taking off and landing, so I’d better put this text on the interwebs before I insert it into my dissertation. So, the question is, would this be a correct interpretation of the differences and similarities? (Of course there are many more, please comment.)

In Foucault there is a double articulation in the emergence of the social sciences. On the one hand, there is their qualitative function in biopolitics, as an administrative, surveying and organizing science, in what he called the emergence of disciplinary societies. On the other hand, their positive domain of knowledge became possible through the void that had to be filled by the ‘tectonic’ rupture between the classical episteme and the modern episteme, a reconfiguration that was external to the social sciences themselves, and occurred in conjunction with how the other sciences rapidly discovered new grounds of knowledge.

This could have been another way of describing what Latour calls the modern constitution, were it not for the drastic philosophical differences between Latour and Foucault. Latour argues that the purification of the modern constitution is an ever ongoing, tedious process. If it is not maintained, it breaks down, and we realize that all we have are ‘savage’ hybrids. The modern episteme, as described in The Order of Things, would on the contrary postulate that the qualitative reconfiguration that took place towards the end of the eighteenth century made thought possible in only one particular way (1). This has sometimes been called a ”structuralist” explanation, even though this is a bad word (2), both because it is quite empty of meaning and because of its association with linguistics.

The key figure for understanding these differences is what I previously referred to as actualism. But first, there is a similarity, in at least one respect. In a very interesting passage in Pandora’s Hope, on how human and non-human agency are related, Latour writes:

Purposeful action and intentionality may not be properties of objects, but they are not properties of humans either. They are the properties of institutions, of apparatuses, of what Foucault called dispositifs. Only corporate bodies are able to absorb the proliferation of mediators, to regulate their expression, to redistribute skills, to force boxes to blacken and close. Objects that exist simply as objects, detached from a collective life, are unknown, buried in the ground. (Latour 1999: 192-193)

The concepts of dispositif and assemblage are closely related, in their collectivity, their positivity, and also in the sense that they are actualist concepts. They configure and enable a collectivity of human and non-human agency to express knowledge in specific ways. The Hubble telescope would be such a dispositif, or assemblage, composed and held together by hundreds of scientists, thousands of technical components, billions of dollars and even the gravity of planet Earth. All of these links need to be aligned, or the black boxes have to be patched and fixed. And the result is nothing less than images of distant galaxies. Remove the humans, and the telescope slowly runs out of power or burns up in flames as it falls through the atmosphere. Remove one lens, and we see nothing more than before Galileo. Thus, it is not the biopolitical side of Foucault that is a problem for Latour.

The problem is instead the model of epistemes. Indeed, along the lines of Foucault, Latour also acknowledges ”Kantianism” as one of the leitmotifs of modern thought (pre-dated by Hobbes and Boyle, see Latour 1991: 57ff). But once hybrids are introduced, the amodern networks which, when multiplied, fold together heterogeneous elements through moments of utterly concrete translation, there can be no prior historical rupture of the kind Foucault argues for in The Order of Things, no void that emerges simultaneously in all the sciences. To put it another way: even though Linnaeus belonged to the classical episteme and Darwin to the modern one, according to Latour they would have done the same primary things: collapsed the inside/outside division by bringing samples of minerals, birds and flowers back into their ‘labs’, inscribing them into systems and classifications, forcing them to crack open, while struggling with kings, churches and perhaps even public opinion to support their assemblages. Neither Linnaeus nor Darwin was ever modern, even though the latter lived in a time when the proliferation of hybrids had become much swifter, more efficient, desired by institutions of immunology, public health, anatomy and medicine.

Footnotes
1. For example, the presence of fossils was an unthinkable figure, a monster, before modern biology introduced ”historicity” into Life; see the chapter ”Monsters and Fossils” in The Order of Things.

2. Foucault himself rejected this label as nothing but the fancy words of ”commentators”; see the ”Preface to the English edition” in The Order of Things [INSERT PAGE].


1999 – How to study the Social Sciences, part II


Last week I published an early draft from my dissertation. Here is a continuation, though it should actually come before the first draft. I realized that facts circulated in the news media usually don’t travel straight from the scientific literature to the press. They have to be mediated, or translated, to use some actor-network theory vocabulary. So I attended one of the SOM-institute’s press conferences and found one such moment of translation in action. Here, then, is a rough and early empirical draft. Oh, and if anyone knows how to translate ”sakföreteelse” into English (in the context below), please comment! Update: I heard it all wrong on my recording, it is only ”företeelse”. No wonder translation was difficult :D

Prelude: a moment of translation

How do you go from a five-hundred-page research report to a brief newspaper article? Each year since 1986, the SOM-institute has published its findings and results in a large volume. The results are however not only meant to stay inside the academic ivory tower; the SOM-institute actively circulates them, and they appear in public debate on several occasions. You could of course read the report for a couple of days, then write a summary or a review. But it is much more convenient to have the results of the report summarized and explained for you by someone else. And along the same lines, if you want your research to circulate outside the report, to reach people who do not have the time or means to spend a couple of days in the library, as a researcher you need to translate the numerous words, graphs, tables and conclusions into a compressed yet credible statement.

One such moment of translation is the press conferences that SOM has held since XinsertyearX. I visited one in XinsertyearX in Stockholm, and one in Gothenburg in 2011. The latter I recorded and analyzed with translation as a focal point.

When you enter the press conference you are handed a copy of the five-hundred-page report, as is the rest of the audience of about 25 people, most of them academic researchers and reporters. The public service TV broadcaster Sveriges Television is filming the event, and on the whiteboard the Twitter hashtag #somgu has been written. As the second largest city, Gothenburg is not considered to be the epicenter of media impact, so the conference takes place in an ordinary lecture hall at Annedalsseminariet, where some of the social science departments are based.

The press conference is opened by the three editors of the report, Lennart Weibull, Sören Holmberg and Henrik Oscarsson, the former two introduced as co-founders of the institute. First, Weibull presents how the survey was made, while referring to the report that was handed out as people walked into the room:

Here we have everything. The SOM-institute is a scientific institute where we work extensively with methodological developments /…/ Thus, the sample is 9000 and [we] make three emissions [of] questionnaires. One is more [focused on] politics, one more on media and culture, one a bit more on life-styles and health. Our base questions in the SOM-survey are in all three of these questionnaires. This is not something you need to know, since it is all here [in the report]. But if one is interested to look it up [more closely] /…/ from page 595 and onwards you have the three questionnaires documented in extenso in the book. (my italics)

Weibull summarizes how the survey was made, but the details are too extensive to give in a two-hour seminar, so they are referred, or linked, to a particular page in the report. To get to the questionnaire, you need to take one more step.

The press conference continues with short presentations of each of the chapters in the report. The audience learns that levels of trust in political institutions are at the same high levels as during the seventies, that political interest increases close to elections, that Swedes have more positive attitudes to immigration, that women are more active in social media, and that people living in rural areas far away from the centers of decision making are more skeptical towards wolves in the forests than people living in urban environments. Every now and then, especially when a number or graph is quoted, the report is once again linked with statements such as ”As you can see on page X in figure Y”. Some of the descriptions are general and some take a more technical turn. Sören Holmberg, for example, describes new techniques for measuring ”job performance” in order to evaluate how public institutions are perceived:

We have been inspired by American research on consultants and politics when we made our measurements, our financial ratios, as [presented] on page 109. It is [called] name recognition; you have to know the FÖRETEELSE [phenomenon] to be able to associate it with values. Secondly, evaluation; not of trust this time, which we measure in other instances, not personal satisfaction, but the evaluation of job performance: how you perceive that the job you are expected to do, how well it is done. That is called job performance in American [English]. (The words ”name recognition” and ”job performance” appear in English in the original)

The results, presented on page 109 and narrated by Holmberg, are quite devastating for two of the institutions that were measured: Försäkringskassan (the Swedish Social Insurance Agency) and Arbetsförmedlingen (the Swedish Public Employment Service) both show low job performance.

Another interesting finding is presented by John Magnus Roos, a researcher at the Centre for Consumer Science at Gothenburg University. He argues against the widespread belief that so-called ”shopaholics” are mostly women buying purses, makeup and clothes. On the contrary, his data shows that gender is not an important factor. Rather, shopaholics are young people who are dissatisfied with life in general, and their degree of empathy is lower than average. This is picked up by the local public service radio station P4 Göteborg, whose reporter Anna Olofsson attended the seminar and interviewed Roos. The radio station publishes the interview on its website the following day (Olofsson 2011):

- If we know more about the personality type [of the shopaholic] then we can both prevent these problems and help the person in need of support, says the researcher John Magnus Roos.

Thus, from this article it is possible to ”reverse-engineer” a widely circulated fact, back to a press conference, which in turn refers to a scientific report. But to go further, to translate from the easy reading of the news media back to the esoteric science, we need to go to the reports. Or rather, we have to open more black boxes.

To continue reading about black boxes, please go to the previous post.


Society as a whole, part VI

How come an InterCity train ran between Värnamo and Gothenburg a week ago? If we consult experts in different fields we get different answers. An economist might say that given certain investments, state subsidies, consumption patterns, working conditions and marketing strategies, it became economically profitable for a society to invest in a rail connection right here. An engineer might answer that half a century ago we had good electric motors, a good electricity supply and good ball bearings, and by putting these together we could build good trains. A physicist might answer that the power of the motors was sufficient to exceed the kilonewtons required to move a few hundred tonnes forward, given that friction and air resistance were low enough. And so on… (a literary scholar might perhaps talk about the role of the train in fiction, as a symbol of modernism and progress).

These traditional explanatory models have as their common denominator that they often fall back on a force outside the particular phenomenon. In the cases above it may be ”the economy”, ”technology”, ”the laws of physics” or ”the symbolic world”. The consequence is that the object itself is decentered and must give way to a reducing totality.

Is there, then, another way to think about all the artifacts that constantly surround us? We cannot leave the apartment or the house without attaching prostheses to our bodies (clothes), prostheses that made it possible to walk out of the temperate climate of the savannahs and made it possible to endure the bitter cold of Småland. Today we often talk about H&M’s share price, garment factories in China and the intellectual property of Gucci bags. But for the Stone Age human, mouflon sheep were perhaps in fashion, without presupposing money, stock exchanges or market analyses.

Latour has a solution, which is then picked up and modified by Harman (discussed at length in the previous blog post and its comment thread. Big ups to everyone commenting on this blog series!).

The ”solution”, or perhaps rather the point of departure, for Latour is the ”black box”. To some extent, though, ”box” easily makes one imagine something static. The concept is introduced in Science in Action from 1987, where it is robbed from cybernetics, where it was coined once upon a time. But black boxes remain an essential building block for Latour far later as well. In the fin-de-millénaire (I wonder if I am spelling that right) essay collection Pandora’s Hope (1999) the black box has been modified slightly and appears as a verb instead: blackboxing.

But to be conservative, let us start in 1987. Here the black box is defined as something that reduces complexity and blends into our lives to the degree that we only need to care about input and output. We do not need to know how every detail of a cell tower works in order to make a call, we do not need to know anything about hydrogen atoms in order to drink a glass of water, and an orchid does not need to know anything about the reproductive mechanisms of a wasp in order to enter into a symbiotic relationship with it.

Latour’s showcase example, however, is the Eagle computer (pictured above), which had to be tinkered and fussed with before it could be made to work at all; another of his opening examples is the discovery of the double helix structure of DNA.

But it was not a good machine before it worked. Thus while it is being made it cannot convince anyone because of its good working order. It is only after endless little bugs have been taken out, each bug being revealed by a new trial imposed by a new interested group, that the machine will eventually and progressively be made to work. (Latour 1987: 11, italics in original)

A working black box is something we rarely notice until it breaks down, or at an early stage when it is not yet fully built. Most of the time one black box leads to another. Once, when I was on an outer journey on an X2000 train, it stopped. The driver said they needed to ”restart a carriage”. For a brief moment I became aware that the whole train consisted of carriages (smaller components) and that these (probably) contained a computer that had frozen. But about the computer I learned nothing more than that it needed to be ”restarted”. Perhaps the driver knew no more than that either, but perhaps an incident report was written and an engineer was given the task of laboriously searching the log files for a fault. Perhaps it would turn out that a switch was defective. Then this new black box would have to be opened. Perhaps it would then turn out that it had frozen and cracked. In a conceivable future, an engineer at Siemens might then sit and design the next generation of switches for train sets in a more temperature-resistant material. And so on… boxes upon boxes upon boxes. As long as they work they can stay closed and we ”believe” in them.

Harman goes one step further in his reading of Latour, in Prince of Networks. When Latour wrote Science in Action he aimed to write a ”textbook” for STS students. Harman, however, reads it as metaphysics (quite rightly!):

For Latour, the black box replaces traditional substance. The world is not made of natural units or integers that endure through all surface fluctuation. Instead, each actant is the result of numerous prior forces that were lovingly or violently assembled. While traditional substances are one, black boxes are many—we simply treat them as one, as long as they remain solid in our midst. Like Heidegger’s tools, a black box allows us to forget the massive network of alliances of which it is composed, as long as it functions smoothly. Actants are born amidst strife and controversy, yet they eventually congeal into a stable configuration. But simply reawaken the controversy, reopen the black box, and you will see once more that the actant has no sleek unified essence. Call it legion, for it is many. (p. 34)

On the one hand, black boxes are very simple when one thinks, for example, of humanity’s geological advance, which in turn makes it rather difficult even to think the nineteenth century’s identitarian concepts of ”nature”, ”earth” or ”the globe”. When we put a slice of bread in our mouths we are eating a ”legion” of black boxes, from the first Stone Age human who began domesticating grass species that were then ground into flour with stones and baked over the Promethean fire, to the intricate network of trucks transporting Hönökaka from a computerized factory to the Konsum store. The transubstantiation of bread is not an inner property, but several relations. And these relations must constantly be made, otherwise they disappear. Harman again:

Third, Latour’s black boxes do not automatically endure through time, unlike most traditional versions of substance. Since they are events, they include all of their relations as parts of themselves. But since these relations shift from moment to moment, the black boxes do not endure for more than an instant, unless we consider them as ‘trajectories’ crossing time across a series of minute transformations. They must also be constantly maintained. This makes Latour an ally of the doctrine of continuous creation, which is also a frequent feature of occasionalist philosophy. (p. 46)

This leads back to a process ontology, or relationism as Harman calls it. A black box that withdraws from all relations does not exist, in Latour’s version. Before dismissing this with spontaneous external realism, one must bear in mind that the relations can be anything whatsoever. A broken car stereo on a rubbish dump is admittedly no longer a ”car stereo”. But it is still a thousand things: 700 grams of metal in a whole network of recycling processes, a work of art for a kitchen-sink-realist photographer of rubbish dumps, an environmental hazard according to the law on hazardous waste, a home for bacteria that thrive in oxygen-rich environments, a potential mortal danger for a seagull looking for food. To be a car stereo, a train carriage, a computer or a chair, to be what we ”meant” by the black box, is only an event. Cosmically speaking, the whole globe is a mega-sized black box, whose inner components we mostly take for granted, even though we are now and then busy fixing the ”bugs” (global warming, nuclear waste in Fukushima, overfished oceans, dictatorships with mad leaders).

The black box cuts right across what Latour later came to call the ”modern constitution”, that is, the belief in, and the idea of, nature and society as two separate poles, which has given rise to the two great philosophical mistakes: A) that society is caused by nature (naive realism, reductionism), or B) that our social constructions govern how and what we perceive as nature (humanism, correlationism, critique). Blackboxing makes no difference between when we discovered the bending of light in large gravitational fields in 1919 and when we manufactured a bronze axe in Thailand in 3000 BC.


Society as a whole

Lately it has been very quiet here on the blog, for the simple reason that my whole existence has sunk into the slow speeds of academia. In other words, I have been working intensively on my dissertation, which according to plan should be finished sometime this autumn.

The dissertation deals with many problems. But one of them, you could say, is philosophical: how are wholes, totalities, cohesive entities possible?

This problem is discussed, among other places, by Kalle (agriculture and the economy) and Rasmus (capital).

One might imagine that the question of wholes is merely a philosophical-exegetical maneuver, something that has no real consequences except for thought itself. But that is not quite the case. What exists is what we act upon. Does the economy exist as a whole? Does the state exist? Does society exist? Does Life exist? (in modern biology).

Even though the perhaps most exciting discussions here take place in the so-called blog sewer, these questions fortunately find their way into academic texts now and then. A current example is Johan Söderberg’s dissertation Free software to open hardware: Critical theory on the frontiers of hacking (which I have not yet read in its entirety, only the introductory essay that is linked).

Within STS research, wholes have long been ignored in favor of case studies and empirically grounded micro-analyses of how, among other things, scientific knowledge comes into being as a practice, often with an emphasis on the concrete rather than the abstract. Söderberg therefore suggests that STS can be cross-fertilized with critical theory in order to sharpen its critical edge:

In one sentence, this is the dialectical heritage of the former which clashes with the post-structuralist influences of the latter. In particular, a key sticking point between the two traditions is the concept of ‘totality’. What political strategies follow from either maintaining or abandoning this concept? /…/ It is from the point of view of the social whole that critical theory claims to be able to transcend the horizons of the individual actors themselves. In other words, this philosophical idea is the key for engaging in ideology critique and for guiding praxis.

Söderberg thus argues that when totalities are abandoned, one loses, or at least risks losing, an important form of ideology critique and thereby also a strategic compass for political action. A ”social whole” is required to move beyond individual experiences, out of micropolitics, and onto an arena of social critique. On the one hand, this is a point of departure for almost all modern sociology, from Durkheim to Marx. On the other hand, totalities form the points of departure for almost all political movements.

The obvious question, which interests me greatly, then becomes whether or not the opposite perspective fulfills these criteria. What happens if the part is larger than the whole, if the composite precedes the totality, and if cohesive entities only hold together because of ongoing processes? If axiomatics is preceded by rhizomatics?

In a very preliminary stage of my dissertation, which on one level aims to explain how these ”social wholes” come into being, I ask the following question (taking the city of Borås as a point of departure):

The main problem, as well as the main challenge, then becomes how these elements hold together, not by a totality, nor by a certain logic, but by way of historical consolidations; sedimentations which were once fuzzy, then progressively growing harder (the opposite, disintegration, is of course also possible). To say that, for example, the inhabitants of Borås think and feel in a particular way, is only possible to do with accuracy and credibility if there is something which embodies that statement. The social sciences never depart from a clean slate, they do not appear out of nothing. Rather, they need a ’full body’ of composite parts, which are aligned in certain configurations. I want to see how they have been connected, how they have been assembled in a fashion which today renders the quantitative social sciences able to speak in the name of the Borås urban dwellers, or for that matter, any other object which falls within epistemic domains.

My question here is how ”society as a whole” can come into being as an epistemic object. Durkheim, for example, solved this problem, very roughly put, by presupposing that there was a sanctioning system of norms that transcended the individual and that was measurable and describable. Now, my dissertation is not about Borås (scientific measurements say that Borås is a boring city), but my question is how we can think Borås as part of a whole (Sweden), as part of a culture (yet another totality) that follows certain patterns and repetitions, and that both constitutes and is constituted by a ”society”.

So, in the end this debate has a kind of ultimate point: does a rejection of totalities lead to ”methodological Thatcherism” or not? That totalizing processes exist is probably denied by no one: they go under a range of labels such as fascism, capitalism, bureaucratization, asceticism, etc. The question is perhaps instead: are they entities or processes, assemblages or objects, relations or bifurcations, cement or sediment?

For those interested, a very early draft of my dissertation is available in a remote nook of the web. Note that this is an early and incomplete draft, which may very well change a great deal before it becomes a so-called dissertation. Details about the seminar can be found in the pdf file.

To further encipher the web, it is also available as an AES256-encrypted file, named insurance.aes256. Decrypt on Linux with openssl enc -d -aes-256-cbc -in insurance.aes256 -out fil.pdf. Password: fraktalpolitik.


The religious practices of copies and uncertainty

Recently there have been a few news reports on the Missionary Church of Kopimism.

Isak Gerson of the church says to the Christian daily Dagen:

-Why do you want to be recognized as a religious organization?

-We want to be accepted by the state and society. We feel that we are met with harsh attitudes and argue that the freedom of religion should include us as well.

Much of the reception of the appearance of the church has been framed in terms of ”file-sharing”. A computer file is however only one way of organizing arbitrary information so that an operating system or program can interpret how it should be technically handled. The various interpretations of Kopimi are much wider.
It is not surprising that there is an emphasis on file-sharing and copyright, as there is a raging persecution of this particular instance of the practice. A few years ago, Kopimi was defined as a ”sect” by copyright lawyer Monique Wadsted:

She argues that there exists a small sect of so-called Kopimists around The Pirate Bay, but that [this sect] is not a wide ideological movement with support from a majority of Swedish youth.

These struggles are nevertheless earthly matters. Religious persecutions of sects have, as history tells us, only led to them becoming even more revolutionary on a social plane. The priests of modernism, among them Émile Durkheim, defined religious life as a set of human beliefs, which were institutionalized into churches as safeguards of morality. If there is a social program of Kopimi, it would be to abolish copyright and intellectual property. However, this is merely a superficial level.

Another closely related religion is the Hierophants of cipher, which has a sort of social program of establishing a state of cryptoanarchy. The Hierophants of cipher are hard to get hold of, since they were founded in, and only dwell in, a non-territorial and non-vanilla internet location inside so-called darknets: encrypted and fully distributed computer networks. Being hard to find is one of their principles, so direct sources of information are not easy to come by.

Let’s instead take a closer look at the esoteric contents of Kopimi and Ciphernetic Hierophantics. As religious classifications they would be categorized as non-theistic mysticism. There are no ‘despotic signifiers’ (Gods, creators), and the path to salvation is ritualistic. These rituals, however, differ from most mystic practices, as they are not transcendental but rather ‘object oriented’. The actual acts, copying and encrypting, are the fundamental building blocks of worship.

As these religions are not ”book religions”, they have no written doctrine, and we must rely on their aesthetic and ethical expressions.

Kopimism can be said to be energy-centric. All copies require energy, be it the electricity driving computers, the carbohydrates burned in sexual reproduction or the solar particles that make plants grow. The pyramid is historically the clearest case of a whole civilization expending energy not only for aesthetic purposes and spiritual harmony in the afterlife, but also as a way of preserving a hierarchic order of society throughout time.

Thepiratebay.org conjugates flows of energy: a single torrent may consume vast amounts of energy as thousands of computers are instructed by their users to transmit data over the internetworks. The pistons of the S23M are interrupted and release the energy flows of diesel oil, and at the Walpurgis ritual the papers of the doctrine were burned and went up in flames.

Hierophantic ciphernetics has a different basis for its practices. Using approaches from quantum mechanics and fractal mathematics (sciences very closely related to what is considered religious belief in modern societies), the practices center around the imperceptibility and indestructibility of (anti)systemic chaos. For example, the myth of the ”Ciphercat” is an allusion to Schrödinger’s cat and quantum uncertainty, and many of the hierophantic practices concern different fractal geometric functions, such as figures on the complex plane. Cameron, one of several interpreters, puts it like this:

Information is nothing but numbers, numbers governed not by human laws, but by the laws of mathematics. Networks that utilize the power of cryptography already exist. It will not be possible to stop the spread of the fractal cipherspace. /…/

It is either the Riemann sphere, the complex plane, or the number of 12 deadly missile strikes in Pakistan in January, of which 10 killed 123 civilians. Two drone strikes killed three al-Qaeda leaders.

This state of affairs is also described in the film Naqoyqatsi (torrent), where fractal mathematics, computing, and the destruction of authority are key motifs.

The two religious expressions combined create an imperceptible energy system. Machines driving other machines, surfacing only by surprise. The inherent sectarianism, or communion of friends, is a necessity for a common principle, that of the event. Every copy is a displacement of another copy. Every instance of a fractal haecceity exists in perfect individuation, even though its continuation is eternal. Every communion is a particular energetic flow/interruption of energy, and its historical continuity depends not on solid functions, but on repetitions along a filament line of self-similarity.


A doctoral student for hire at Juridicum?

Johan Axhamn next to Monique Wadsted and Jan Rosén

As is well known, Johan Axhamn has written a thoroughly vacuous op-ed in SvD about internet blocking, in which he proposes that ”affected interests”, i.e. the copyright industry, should be given control over some form of censorship lists on the internet.

My personal opinion is diametrically opposed. The internet operators must be defended. Mere conduit must be defended, and special interests from meaningless industries must be kept away from our most important way of communicating with each other.

Since I am a doctoral student myself, I am interested, in a questioning rather than an accusing way, in googling around a bit about Axhamn’s merits. But it still looks a bit odd with a CV like this:

- Sweden’s rapporteur for Kluwer Copyright Cases.

- Sweden’s rapporteur for Thomson/West Copyright Through the World.

These two companies (yes, they are companies) are rather unsympathetic. Kluwer is one of those giant academic publishers that are happy to see copyright preserved as it is, and Thomson, whose real name is Thomson Reuters, has in principle a monopoly on metadata in citation indexes, not only for legal works but for the entire Science Citation Index.

Some other merits are:

- Member of Institutet för Immaterialrätt och Marknadsrätt (the Institute for Intellectual Property and Market Law).

- Active in Upphovsrättsföreningen (the Swedish Copyright Association).

Beyond this, the assignments for Nepotia perhaps need not even be mentioned. The question is how far you can go as a doctoral student. What happened to the scientific ethos of disinterestedness?
