Ever since Recep Tayyip Erdogan moved from being prime minister to president of Turkey in 2014, the country’s politics have continued an alarming drift towards autocracy. Erdogan has taken his strong party identity and command-and-control style with him – and is seriously eroding the nation’s checks and balances on personal power.
Turkey’s various presidents have been men of party political and military backgrounds alike. Though it would be naïve to suggest that none of them had any pre-existing political agenda, the record of direct party political manoeuvring is scant.
The previous president, Abdullah Gül, was often condemned for his uncritical ratification of legislation passed by parliament, but in general he made an effort to stay above party politics – Gül and Erdogan shared a background in the conservative Justice and Development Party (AKP). Gül’s predecessor, former constitutional court judge Ahmet Necdet Sezer, was a firm check on the early years of AKP governments.
But things are different now. The structures that hold back the increasing authority of Erdogan and his party have been under attack for some time – and Erdogan may be on the brink of finally overwhelming them. He is quite openly manoeuvring to concentrate power in his person rather than the office he holds, and he has been doing so for some time.
The Gezi Park demonstrations in May and June 2013, for instance, were sparked in part by his arrogant statements on municipal issues in Istanbul, blithely overriding the governor, mayor and city council.
When a massive corruption scandal broke in December 2013, Erdogan became combative. Wiretaps were released implicating AKP ministers, Erdogan and their sons in wide-scale embezzlement. Erdogan first dismissed the wiretaps as forgeries, then held them up as evidence of a conspiracy.
But ultimately, any “conspiracy” against him clearly failed, as 25 police officers and various others were arrested in raids against those who instituted the wiretaps in the first place.
This was just one of many attempts to rein in Erdogan that have failed. After the wiretap scandal, he not only bounced back, but campaigned to great effect in the municipal elections of March 2014, sometimes appearing simultaneously in different places by way of a hologram. And despite the previous year’s upheavals the AKP won a majority across the country.
Neither Erdogan’s overreach nor evidence of corruption moved the electorate against the AKP. The verdict seemed to be “they steal, but they work hard,” in contrast to previous more secular-minded governments which were also accused of corruption, but were not seen to be working for the good of the country.
And while the AKP certainly benefited from heavily favourable coverage by the state broadcaster TRT, the charisma and personal power of Erdogan himself was also a major factor. Any attack on Erdogan simply seems to galvanise his supporters behind him.
Now that Erdogan is president, not prime minister, he is meant to be on a much tighter leash. Article 101 of the Turkish constitution makes it explicit that the president must sever all connections with their party. But Erdogan is not just flouting this core requirement; he is openly campaigning for his party in the run-up to the 2015 general election.
Erdogan has also been giving a series of lectures to “muhtars”, village and neighbourhood officials who are elected but not affiliated with political parties. Since these officials have local influence and a role in registering voters, recruiting them to a party political agenda is also against the law.
Most shockingly of all, Erdogan has actually started asking the electorate to return 400 MPs for the AKP, which would provide the AKP government with the majority it needs to unilaterally amend the constitution. For the president to make this plea at all is illegal.
Regardless of what happens in the election, substantial damage has already been done. The previously ceremonial office of the presidency is rapidly being turned into a powerful executive post, drawing influence and authority from a parliament subservient to the person rather than the institution.
It is not inconceivable that if they were elected, 400 AKP members of parliament (out of a total of 550) under the de facto leadership of Erdogan could vote to rewrite the constitution and overnight make his currently illegal electioneering legal – and along with it, his radical effort to gather ever more unaccountable power for himself.
Japanese universities may have been born out of European models, but they have set down their own firm foundations since the opening of the University of Tokyo in 1877.
The higher education system in Japan is a hybrid one, with public and private universities, both regulated by the state. There were 86 national universities and 603 private universities in Japan in 2014. Add to these 92 municipal universities and there were a total of 781 universities. But this national system is in crisis today: the government and the minister of education have focused on improving Japan’s place within the global system of higher education, and last year announced extra funding available for “superglobal” universities.
A political push to get ten Japanese universities into the list of the world’s top 100 universities by 2024 is unlikely to improve the situation, with universities still suffering from financial difficulties. Tuition fees are rising year by year and scholarships act as a system of loans to cover them. For example, for the first year of politics and society at Waseda, a private university, students must pay 1,300,000 yen, or £7,275. For a national university, the price would be closer to 680,000 yen.
It’s worth looking back at where Japanese universities came from to understand the predicament they face today. The French sociologist Pierre Bourdieu wrote about the nobility of the Japanese state, dating back to the eighth century and stretching to the 20th century. According to him, the Meiji restoration of the 1860s was a conservative one, started by the samurai with few resources in an effort to transform the symbol of the nobility – its bureaucratic civil service.
To understand the continuing role of nobility in Japanese higher education today, it’s important to understand the relationship between Japanese and foreign languages. Before the Meiji period, the Japanese were inspired purely by a Chinese model of written language: in the far east, the equivalent of Latin for Europeans was Chinese. In Japan, people read Chinese and pronounced it in a Japanese way, remodelling the words with a Japanese syntax.
In 1877, the University of Tokyo was created, allowing foreign professors to teach in their own language. As a prerequisite to their course, students had to learn a foreign language for three years. After their studies in the university, they were sent abroad by the state to deepen their knowledge and then became teachers on their return, this time teaching in Japanese. This was the way through which the Meiji government wanted to modernise the country and assure its independence.
With enough educated people to staff higher education, Japan no longer needed to resort to foreign teachers. A second phase began with the creation of the Imperial University in 1886, in which all teaching was in Japanese and foreign languages were excluded as a medium of instruction.
The reality of this priority soon became clear to Japanese academics. Soseki Natsume, an English literature professor at the Imperial University of Tokyo, was one of the first victims of this Japanese cultural system. When he was sent to England in 1901 by the state for his studies, he suffered a nervous breakdown because he couldn’t adapt to life in London. When he returned, he quit his post as a teacher and chose to become a novelist. But it’s thanks to him that modern Japanese literature became possible.

After the end of the Meiji era in 1912, passing through the Taisho era and into the Showa era in 1926, the culture of translation from European languages into Japanese blossomed. We could call this the creation of the Japanese “Bildung”, in reference to the ideal of education as a form of self-cultivation set down by German educationalist Wilhelm von Humboldt in his conception of the modern university.
This cultural space for Japanese people was ideologically very closed. The majority of universities and intellectuals of the time couldn’t criticise the authority of the imperial regime. After World War II, the pre-war university system was totally revised, but the Japanese space for language stayed the same. American democracy hadn’t succeeded in transforming the country. The expansion of the higher education system in Japan, built on a vague idea of a university open to everybody, has actually just multiplied the number of private universities in the country.
But since the 1990s, a third stage has begun. Now it is English that has become hegemonic. If you can’t speak English, you are a second-class citizen. In Japan today we speak of an “English divide” to describe this predicament, in the same way we talk of a digital divide.
Universities where Japanese students are taught in English are considered more prestigious. Admittedly there is a chance that Japanese higher education is finally opening up to the wider world, but this also presents a great menace to Japanese culture.
It’s worth remembering that the idea of the university was originally founded on a Christian religious basis. The modern university was conceived by Humboldt as a universal, secular institution, detached from all religious symbolism. A lot has changed since then, but I’d argue that the founding principles – education that remains free, with scholarships available to all students – must stay alive. If Japanese universities are now suffering under the policy of “globalisation”, it’s because the country’s higher education system largely side-stepped this historic basis of its universities.
This article is part of a series on Universities at the crossroads.
The Week 10 Modern and Contemporary History Research Seminar is on Wednesday 18 March 2015, at 17:15h in the Rodney Hilton Library (note change from usual time). It will be delivered by:
The role of the university as a place of education and research, as an employer, and as an important part of the social landscape has changed dramatically in the last decade.
As PhD students from various European and North American academic backgrounds, we are keenly aware of these developments and have been involved in them or mobilised against them – often both at the same time. One of the most pressing issues from our perspective is that of the workforce in universities, especially the collapse of working conditions for many academic and non-academic staff.
Professors, who once enjoyed excellent working conditions in Europe and North America, are now being subjected to stricter, stranger, and more noxious standards. They are pressured into constant external grant applications, and are threatened with severe sanctions if the administration considers the results of this search inadequate. The case of Stefan Grimm, a professor at Imperial College London who was found dead in September 2014 shortly after a distressing email exchange about funding, is one tragic example.
Professors are increasingly being judged according to various forms of ranking, both state sponsored (such as the Research Excellence Framework in the UK) and international ones such as the Shanghai ranking and the Times Higher Education ranking of global reputation. These rankings, as Cambridge historian Stefan Collini argues, do not actually reflect the excellence of the research, or the quality of the university. And yet, they matter tremendously to university administrators, students, and state officials.
Of course, professors are not the only academic workers at a university. There are throngs of other individuals involved in the production of knowledge. These include temporary teaching staff, “research assistants”, or graduate students who often combine their own thesis-related work with teaching and with non-thesis related “research assistance”. It has been argued that some of these schemes provide valuable experience for graduate students, allowing them to be more competitive in the clogged-up academic labour market.
But this experience can come with unpleasant strings attached, such as less than adequate working conditions. Or teaching opportunities without pay, as recently proposed by our own institution, the European University Institute.
Temporary teaching staff are frequently employed in dire conditions, as in the United States, but also in the “social-democratic paradises” of Scandinavia. High competition, low pay, few to no benefits and very unstable contracts have become the rule, rather than the exception. In Norway, for example, as much as 20% of all university and college employees are hired on temporary contracts.
Such harsh conditions make it particularly difficult for members of historically disadvantaged groups, such as women, people from lower social classes, and those with a migrant background to succeed, as they are the ones most affected by the low pay and lack of benefits. The result is a less socially and intellectually diverse university.
We should not forget that an often neglected but huge part of the university-employed labour force consists of non-academic staff. As an institution, the university does not simply produce knowledge – it also consumes a vast amount of services. These run from university administration to cleaning and catering.
The workers who perform these tasks are to a significant extent, the life-blood of the university. And yet their important contribution often remains unnoticed even when their working conditions, and therefore their livelihoods, are being attacked, as has happened in recent years. As with young academics, those who are overwhelmingly affected by these degrading labour conditions come from underprivileged backgrounds. They are often women, migrants or both and do not usually have ready access to the media to fight back.
In late 2011, in Montreal, members of the McGill University Non-Academic Certified Association went on strike for almost four months. They did so in opposition to a new contract proposed by the administration. The university wanted wage cuts in real terms and detrimental changes to benefit schemes, including pensions.
Across the Atlantic in 2013, students and staff at the University of Sussex occupied a medical school lecture theatre, protesting against the university’s continued privatisation of services that threatened the working conditions of staff including porters, caterers and security workers.
The responsibility of national governments for “marketisation” and the drive for privatisation in higher education is sometimes underestimated, both within and outside academia. Reforms aimed at privatisation are very often the result of government intervention in the management of universities, and have been imposed from the top down. This has been done by governments of both the centre-right and the centre-left.
Similarly, resistance to these trends comes both from a diverse alliance of the radical left, who draw on theories of financialisation and neo-liberalism to explain our current economic situation, and from more conservative scholars who see themselves as the protectors of ancient academic tradition.
As young scholars, we are part of the university’s future. It seems evident to us that we should ask questions about what universities are for. But in so doing, we must not forget to ask another, bolder question: “what should universities be?”
There is no “going back” to any perceived golden age, but it is beyond doubt that there are aspects both of the academic tradition and of the post-war ideal of affordable or free higher education that are worth defending. As institutions charged with the important task of producing new knowledge, universities should not be desperately mimicking already outdated forms of corporate organisation, but rather be leading the way towards something better.
This article was written with the assistance of Tiago Matos, Kimon Markatos, Hannah Elsisi and Tommaso Giordani. It is part of a series on Universities at the crossroads.
Parasites, pedants and superfluous men and women. Those are some of the accusations that have been levelled against historians and humanities scholars, according to Anthony Grafton, former president of the American Historical Association. He argues in favour of a general education, rooted in the humanities, that can make students independent and analytic thinkers. This is something that, he says:
Matters more than ever in the current media world, in which lies about the past, like lies about the present, move faster than ever before.
Academics around the world widely share these views. Some European educators, such as A C Grayling, founder of the New College of the Humanities in London, advocate a broader agenda of learning, citing the value of the humanities in “rendering people fit to deal with unpredictable … challenges”.
Yet while the liberal arts tradition of “general education” remains embedded in American higher education, in Europe it has been left aside.
The US inherited the “liberal education” pattern from Britain, originally designed for the education of privileged youngsters. It was based on a complete and well-organised introduction to human knowledge in art, literature, science, and social life, through an overview of classical studies and the knowledge of western intellectual tradition.
In the 20th century, facing an increase in student numbers in secondary and tertiary education as populations expanded, several US reformers argued that extending access to a common body of information and ideas was more important than splitting curricula up into different vocations. This, they argued, would better serve the democratisation of tertiary education as it was rolled out to the masses.
Their most influential document was the report General Education in a Free Society, prepared between 1943 and 1945 by a group of Harvard faculty members and inspired by their president, James Bryant Conant, an advocate for equal opportunity and meritocracy in intellectual careers.
The committee’s objective was a reform of Harvard’s curricula, but its conclusions involved the American education system as a whole and have had a lasting impact. In the struggle of American civilisation against the totalitarian threat of World War II, they said that a general introduction to western cultural heritage would help foster the necessary qualities for free and responsible citizenship.
They argued that reflection and dialogue on great ideas of the past were the bases for critical thinking and for the identification with common values. A “well-rounded” general preparation was important to acquire the flexibility of mind, self-knowledge, and understanding of the world needed to choose a profession. And college programmes based on common subjects rather than on elective choices would facilitate the academic integration of gifted students, regardless of their background.
The committee’s proposals centred on the connection between comprehensive high schools, designed for universal attendance, and post-secondary curricula. They wanted to integrate vocational programmes within a set of courses devoted to a dynamic presentation of the realisations of human knowledge.
The idea that general education was a tool for a truly democratic school system influenced post-war federal policy. A report called Higher Education in American Democracy, prepared in 1947 by a commission appointed by President Harry Truman, suggested all levels of education were aimed at “a fuller realisation of democracy”, “international understanding” and “the application of … trained intelligence to the solution of … problems”.
This was to be achieved through the administration of a broad and well-organised set of non-vocational subjects. After the 1957 “Sputnik shock” – major curricula reform sparked by the Russians being first to launch a satellite – funding programmes for the improvement of the US education system followed some of these guidelines.
In the same period, American public diplomats tried to influence education reforms in Western Europe, in view of the integration of North-Atlantic school systems and their cooperation in cold-war competition. Not by chance, in the 1950s Conant and his collaborators visited West Germany, Italy, Britain, and Switzerland as policy advisers.
They argued that European reformers needed to delay the choice between academic and vocational training – made when pupils were about 11. They also thought Europe’s education systems should reduce the strong distinction between traditionally academic and purely vocational secondary school curricula, still characterised by the presence of privileged subjects for admission to university and by the reference to the study of Latin and literature as an element of selection rather than inclusion.
Their advice to Europe was also to lessen the specialisation of university faculties, which were still designed for the advanced preparation of an elite group of professional intellectuals. Instead, higher education should be transformed into a stage in which a growing number of students could complete their cultural and personal development.
Despite the extension of compulsory schooling, European education maintained a higher fragmentation of curricula. Reformers could not achieve the integration of all school cycles within a well-defined project of learning along the lines of the US example. In fact, agreement on further changes among political leaders proved hard to achieve. Reformers also faced the opposition of several conservative education professionals.
These deep-rooted differences are still clear today. Even vocal critics of American universities say “liberal arts” programs “are still the best that higher education offers” and represent a wise investment, compared with “majors in fields like furniture design”.
As for Europe, some scholars now believe that the Bologna Process – an ongoing project to make higher education comparable across Europe – is inspired by a misconceived “American model”. They argue that it has been built around concepts of “employability” and the “student-as-customer”, and promotes further specialisation of training.
To counteract this, the education historian Jesper Eckhardt Larsen has argued that the American liberal arts tradition “facilitates a breadth of cultivation … [which] is relevant for life rather than just for work”. It may be a good starting point to re-orient European higher education policies.
This article is part of our series, Universities at the crossroads.
In the context of current debates over free speech on campus and the privatization of the government’s student loan assets, here’s a piece – recently published at The Conversation, an excellent site co-sponsored by the University of Birmingham – which tries to set current events into historical and international perspective.
Universities around the world today face pressure to conform to economic rationality and contribute to national innovation. Though often presented as a revolution, driven by “globalisation” or other vague buzzwords, this is nothing new. Research and teaching have never been free from external constraints and public universities have long been expected to justify the resources society devotes to them.
But universities feel threatened and increasingly incapable of fulfilling their primary functions. The question at the centre of most current debates on university reform is to what extent universities themselves should determine the goals, values and norms of pedagogical and scientific practice. For politicians and the general public, academic freedom – even as a noble principle honoured mainly in the breach – is becoming meaningless.
Debates on the freedom of higher education are as old as the university. But today’s ideologically imposed constraints are very different from the financial dependence of public universities on the state after 1945. The current international trend towards semi-private, semi-public universities poses new challenges to academic freedom. This is exemplified by the dominance of market-based vocabulary and principles for scientific conduct.
And the adoption of corporate management models is leading to the authoritarian concentration of power within universities. Critical voices opposed to current reforms argue that intellectual autonomy is being sacrificed to an unworkable vision of financial autonomy for public universities.
These debates are at the heart of a collection of articles on The Conversation. The pieces shed much needed historical light on the current restructuring of higher education and research – in Europe and beyond. They emerge from a recent major conference on higher learning and politics.
The cross-national historical comparisons presented here illustrate the peculiarities of the current reform culture. They also demonstrate the historical complexity of the relationship between university and society, and warn against national parochialism. When told there is no alternative, we should look abroad for ready proof to the contrary.
Higher education, society, politics, and the market have had very different interconnections in different countries. As a result, despite the wide influence of marketisation ideology, there are real differences around the world reflected in public discussions on the future of the universities. We give a flavour of that variety here.
The public universities of contemporary Europe date from 1945, yet they are based on the early 19th century Humboldtian ideal of academic freedom, and on the value of faculty members who both teach and conduct research. Spreading round the world, this model gave rise to numerous local variations, including in the Anglo-American sphere, which in the 20th century overtook the German-French universities.
Today, the dominance of English-language universities is evident in many different regions of the world. Yet as the article on Japan in this series illustrates, the mix of internationally circulating university models and national traditions of higher education has produced very different results. Despite pressure to homogenise, the introduction of marketising principles of university management has provoked very different reactions around the world.
As Italian historian Andrea Mariuzzo shows, idealisation of elite American universities is nothing new in global higher education. But nor is misrepresentation of the US system in order to justify various national projects. Mariuzzo examines Harvard reformers’ efforts in 1945 to define the balance between general liberal education designed to produce citizens, and specialised instruction supposedly aimed at economic success.
Meanwhile, Japanese historian Shigeru Okayama describes how European models of higher education influenced the Japanese approach from its inception. But he also exposes the failures of the private university system there, and the growing divide between English and Japanese language teaching.
A collective of doctoral researchers at the European University Institute have also provided a view “from below”, explaining how the marketised university is experienced by those who represent its future.
It is undeniable that some of the current challenges to higher education are specific to our times. But others have a long history, despite being widely seen as new. We often hear that the university is globalising. In fact the nation state remains a key player, and global academia remains primarily a space for international competition.
Within this space, all kinds of international honours contribute to national prestige, and individual scholars mobilise international recognition for national purposes. Distinguishing between which reforms are truly new and which are merely presented as such, and grasping the interplay between global trends and national situations will help us think about how to react in the face of today’s challenges.
This is the first in our series, Universities at the crossroads.
The Week 9 Modern and Contemporary History Research Seminar is on Wednesday 11 March 2015, at 16:15h in the Rodney Hilton Library. It will be delivered by:
Abstract: Frontiers, ‘ungoverned spaces’, and ‘failed states’ continue to exert a powerful grip on the imagination of states and their agencies, as recent events in Iraq and Syria have demonstrated. Throughout the nineteenth century, the British colonial state in India viewed its northwest ‘frontier’ with Afghanistan in similar terms – as a violent geography, an ‘uncivilised’ ‘outlaw state’, ineffectively governed, prone to perpetual bouts of conflict, and as the originator of violent resistance groups and regional conspiracies. This early draft paper, which draws upon a chapter from a forthcoming book, considers how colonial officials constructed imperial space in India’s northwest territories in the mid-nineteenth century (modern-day Pakistan). The paper explores how this spatial regime – an amalgam of colonial knowledge, new ‘scientific’ modes of governmentality, and wider imperial visions of global order – evolved and shaped British relations with Afghanistan during this period, eventually leading to war in 1878. In particular, the paper shows how a new and vocal military ‘epistemic community’ was able to capture policy discussions, which became couched in terms that prioritised imperial defence, signalling a shift away from earlier forms of ‘embodied’ colonial knowledge, towards a more generic, ‘disembodied’, and utilitarian style of policy advice. This rationalisation of geopolitical space, and the policy advice it spawned, put further distance between the myth and reality of colonial authority, particularly on the fringes of empire. The paper, although largely empirical in nature, offers new avenues of exploration for those interested in the construction of political space by powerful actors, and for those interested in imperialism and International Relations more broadly. It also offers an historical perspective on the enduring urge to rationalise contemporary imaginative geographies of disorder and violence.
All are welcome, and there will be drinks.
In the first of a series of occasional pieces by speakers in the Modern & Contemporary History Research seminar series, Dr. Steve Hewitt (Birmingham) reflects on the history of state surveillance ahead of his talk on Wednesday.
Surveillance does not affect everyone equally. Since Edward Snowden made his initial flight to Hong Kong with a treasure trove of documents digitally stuffed in his computer, stories about the surveillance reach of the modern technological state have abounded and continue to appear on a regular basis.
Some accounts focus on generalized surveillance on a global scale; others are of particular interest to certain nations. The latter includes the case of Canada and the Communications Security Establishment Canada (CSEC) pilot project that intercepted Wi-Fi transmissions at a Canadian airport. Another example is GCHQ carrying out intercepts in the United Kingdom without seeking a warrant to do so.
There is a clear fascination with the technology and the scale of the surveillance and the notion that the risk is equivalent for all of us.
This discourse, however, obscures important points. First, the idea of equality in the face of Big Brother’s perpetual gaze in a “panoptic society” is, in some respects, ridiculous. While it is certainly true that all may see their communication intercepted, the key point frequently forgotten in the frenzy of discussion is what happens to the material collected. At this point, the idea of equality breaks down as notions of threat and deviance emerge. A version of what sociologist David Lyon refers to as “social sorting” comes into play. Specifically Lyon argues that:
The key practice here is that of producing coded categories through which persons and groups of persons may be sorted. If personal data can be extracted, combined, and extrapolated in order to create profiles of potential consumers for targeted marketing purposes, then, by a similar logic, such data can be similarly processed in order to identify and isolate groups and persons that may be thought of as potential perpetrators of “terrorist” acts. Such “social sorting” has become a standard way of discriminating between different persons and groups for the purposes of providing differential treatment (whether this is encouraging certain classes of consumer to believe that they are eligible for certain exclusive benefits, for example, through club registration and membership, or facilitating or restricting traffic flow through airports by reference to watch lists and ANPR data).
To put it in more real-world terms: I, as a white, Euro-Canadian, middle-class male of slightly left-of-centre political views and agnostic religious beliefs, have little to fear from blanket surveillance. Conversely, change one or several of those characteristics, such as religious belief, and suddenly a convergence can occur with the characteristics of a marginalized category that has been mapped onto a “threat” by the structures of power. As a result, this shift can lead to far more intrusive surveillance and direct consequences, rather than simply the collection of data.
A double standard exists, then, in terms of who is generally on the receiving end of intrusions from the state versus what actually gets public attention. One can even see such dynamics at work around the Snowden leak. When David Miranda, the partner of Glenn Greenwald, the journalist who broke the Snowden story, was detained in 2013 under Schedule 7 of the Terrorism Act 2000 for several hours of questioning at Heathrow Airport, there was a widespread outcry among the media, civil libertarians, and politicians. Receiving far less attention or response was a subsequent Guardian story revealing that 56,000 people had been detained the same way in 2012. It is not hard to guess at the dominant characteristics of those tens of thousands who found themselves detained for questioning under Schedule 7.
Accordingly then, certain groups and individuals have long been subjected to more intrusive surveillance and very serious consequences because of their ideology, race and ethnicity, gender or sexuality, religion or nationality – or some combination of these factors. At one time during the Cold War such attention was generated under the official rubric of counter-subversion. Since 9/11, meanwhile, the justification has become counter-terrorism or counter-extremism.
Hence, the phenomenon is not new, although arguably, thanks to technology, the scale of what the state can collect has vastly increased.
David Lyon, “Surveillance as social sorting: Computer codes and mobile bodies”, in David Lyon (ed), Surveillance as Social Sorting: Privacy, Risk and Automated Discrimination (New York: Routledge, 2005), p. 16.
David Lyon, “Airport Screening, Surveillance, and Social Sorting: Canadian Responses to 9/11 in Context”, Canadian Journal of Criminology and Criminal Justice 48:3 (2006), p. 404.
More info on other funding for MA study is here: