
research 1/2018


research: The magazine of the Deutsche Forschungsgemeinschaft, 1/2018

Title: Photo Heeke / exhibition catalog Spektakel der Macht, Darmstadt, 2008. Counting stick (Manina) and ballot (ballotte) from Venice in the late 18th century: a vivid example of decisions that followed the principle of chance.

Leibniz Prize 2018 (Rembert Unterstell): Clear the stage! Excellent top-level research, and a premiere for the new Federal Research Minister +++ Cross-section: news and reports from the DFG +++ Heinz Maier-Leibnitz Prizes +++ India: Increased Cooperation +++ Africa: Next Einstein Forum +++ University Medicine: Advanced Clinician Scientist Program +++ Report on the DFG review system

Comment

Peter Strohschneider

Shaping world change

Digitization and digitality are changing the world, and they are changing the sciences, with incalculable opportunities for knowledge but also with enormous challenges. Not least, science-led research funding must face them.

That the world changes is its very principle. That the sciences change the world is, in the same way, their self-understanding and their promise. Of course, they themselves are also changing and are in turn exposed to constant change. With digitization and digitality we are currently experiencing such a world change, manifold and multiply interlocking. It means and encompasses far more than what is sometimes made of digitality in science, but also in politics, when it is shortened to technical catchphrases or to questions of innovation, education, legal or democracy policy, whether these be called Industry 4.0 or broadband expansion, tablets in elementary school or artificial intelligence, network enforcement law or e-democracy. The possibilities and tasks associated with computerization, with texts, images, sounds and things becoming digital, and with the ubiquitous networking of everything on the Internet are far more complex and reach much deeper and further. In all seriousness, they concern questions of individuality and collectivity, of economy and society, of state and law, of knowledge and power. And of course also questions of science. This rapid and extensive world change confronts us with the experience that Søren Kierkegaard, at the threshold of modernity, put into the formula that life can only be understood backwards but must be lived forwards. We live in the midst of this epochal change and help to shape it, yet at the same time it remains obscure and often incomprehensible to us.
We act amid an abundance of information, yet under conditions of serious information deficits and dramatic uncertainty. (This article is the revised version of the speech given by the DFG President at the DFG's New Year's reception on January 15, 2018 in Berlin.) At most one prospect seems reliable: linear extrapolations of what we already know will miss the future. In any case, the future will be different from what the utopias promise and the dystopias threaten. For the world change of digitality will neither come to an end in the foreseeable future nor can it be objectively delimited. It does not have the shape of a problem for which final solutions, let alone the one and only solution, could be found. Such constellations of digitality and sociality, of the necessity to act, openness of goals and uncertainty also determine the sciences, and science administration, research funding and science policy are no less concerned by them. Research is decisively driving this world change: among other things in mathematics and computer science, in materials science, and in those new fields for which we currently have only the vague term data sciences. At the same time the sciences, like all other parts of society, are subject to this changing world, with all the incalculable opportunities for knowledge and the design tasks that result from it. Research, as a historically evolved and thus changeable concept, is transformed by digitality insofar as previously analog data become available digitally and can thus be processed with new methods and new questions. In the life sciences or in physics, data-intensive technologies make completely new forms of research possible in the first place. At the same time, established forms of research are being substituted; across entire fields of science one can observe how new mathematical methods gain importance in their place.
From these questions of methods and research practices one quickly comes to a change in what can be understood, in epistemological terms, by research and science, and in what, in sociological terms, belongs to the prerequisites and conditions of their practice. What actually is research, knowledge, a scientific argument, a proof or evidence, when the distinction between correlation and causality becomes blurred or algorithms take the place of theories? When, for example, an experiment is replaced by digital simulation; when a neo-positivistic belief in numbers dominates which no longer knows how to distinguish between the possible unambiguity of data, the contested nature of their interpretations and the ambivalence of their social consequences; when a research result is due to an algorithm which is in turn the result of machine-learning processes? Such epistemological shifts are followed by questions of the sociology of science. What, after all, is a research achievement when thinking begins to be automated? How is it attributed to individuals? What will establish a scientific reputation in the future? And there are legal and financial questions: Who is responsible for research, who is liable for its consequences? How is this regulated legally, how financially and economically? These and other questions, including good scientific practice, the publication system and research ethics, pose new challenges for science policy and administration, and not least for science-led research funding. Just to indicate the radical nature of digital change here: the entire funding system, not just the DFG, relies on scientific peers, on whose judgment all funding decisions are based. But is their judgment irreplaceable? Could funding decisions not also be automated, that is, made on the basis of algorithms that rank the project applications? We would then get by with a few clerks and an IT support group.

Photo: DFG / Ausserhofer
But the incentives to optimize project applications in an algorithmically streamlined manner would be incalculable, as, probably, would the consequences for the quality and originality of research. Above all: would not a weakening of the justification and comprehensibility of funding decisions be the price of such automation, and hence a loss of legitimacy?

In this respect, too, the world change of digitality is an enormous design task. The order of knowledge and the social order of science, the epistemic, economic, financial, legal and political aspects of this change influence one another in many ways and must be considered in their complex and contingent interrelations. Yet precisely this requirement seems to be missed by one particularly prominent attempt to shape digital change in research policy: the so-called 3-O strategy (Open Science, Open Innovation, Open to the World), developed by EU Research Commissioner Carlos Moedas on behalf of the European Commission, which makes the current technical level of digitization the benchmark for science policy programs. At the same time, the ideological catchphrase of openness conceals the openness and unpredictability of world change rather than taking it seriously conceptually and giving its political shaping a wise direction.

As a funding institution as well as the self-governing organization of the sciences in the Federal Republic, the DFG sees its responsibility here. We plan to do it justice in three ways: First, by opening forums for, and accompanying, subject-specific reflection on digital change in all areas of science (and what counts as subject-specific will itself change under the conditions of digitality). Second, our funding activities and their instruments and procedures will have to be developed further. And third, the DFG will face new tasks in advising politics and society with regard to the development of science in the digital age.

In order to meet this threefold responsibility, we are carrying out an extensive structuring project in the DFG head office (see box). The Presidium will also set up a high-ranking expert commission on science in the digital age, and we will seek to support the digital change in the sciences in other organizational forms as well. In doing so, however, we are guided by the view that research will continue to matter in the constitutionally precise sense of a specifically professionalized form of the free, methodical search for truth. It will continue to require public sponsorship and funding, in the data sciences as well. And this must legitimately take the form of funding decisions that continue to rest on human judgment, which cannot be automated.

Admittedly, such judgment is not simply given; rather, it is given to us as a task. It has to be cultivated, and it requires institutional freedom in order to develop. Without it there would be neither accountability for research achievements nor justification of funding decisions, both of which are constitutive for the DFG. The complex entanglement of the technological, the epistemic and the social that lies in everything digital cannot simply be skipped over digitally. Research that is productive and fascinating, and that justifies society's trust in it, rests on far too many preconditions for that. With this in mind, the sciences, and within them the DFG, want to and should tackle the tasks of shaping science in the digital age. They are tasks of action as well as tasks of knowledge!

Prof. Dr. Peter Strohschneider is President of the German Research Foundation.

Stocktaking of digital change

DFG-internal project collects information on disciplinary cultures and funding activities

A long-term project at the DFG head office in Bonn aims to prepare the DFG's positioning on digital change in the sciences. It was preceded by a decision of the DFG Senate. The positioning is intended to attempt a discipline-related inventory, analyze funding activities and discuss conclusions for policy advice. It takes the form of an initially six-month concept phase, a survey of wide-ranging expert knowledge within the DFG and an analysis of previous experience in funding practice, in order to prepare further activities. The evaluations so far suggest nine relevant dimensions of digital change in the sciences, ranging from a scientific-practical dimension through institutional and ethical ones to commercial and political dimensions. In addition, at least three forms of digital change must be distinguished in this field: a transformative digital change, here called digitization; an enabling digital change through devices and new technologies; and a substitutive digital change. The project will be continued in 2018 on this basis.

Information infrastructures of the future

New position paper on the orientation of funding in the field of scientific literature supply and information systems

The DFG has adopted a new position paper on the future development of scientific information infrastructures in the context of digitization, open-access transformation and research data management. At its March meeting, the Senate approved the strategy paper on the promotion of information infrastructures for science, prepared by the Committee on Scientific Library Services and Information Systems (AWBI).
In view of the extensive digital change in the disciplines and the continuing high dynamics of that change, the paper analyzes the initial situation, defines challenges and priority fields of action, and offers a guideline for funding in the field of scientific literature supply and information systems (LIS). "As the self-governing organization of science and a national research funding organization, the DFG is actively helping to shape the digital change in the sciences," emphasizes President Prof. Dr. Peter Strohschneider. To this end, the paper describes the current requirements in the environment of a data-intensive and networked science as well as the growing need for coordination and cooperation on several levels: within the scientific communities, between the infrastructure facilities, and between infrastructure and science. On a technical level, the paper deals primarily with three funding areas: the development and digitization of information resources, the open-access transformation, and the area of research data. "The position paper is an important element in the DFG's strategy of systematically accompanying the ubiquitous effects of digital change on the sciences, assessing the opportunities and risks and thus aligning the DFG's actions with the interests of science and the universities," Strohschneider emphasizes.

programs / lis / positionspapier_informationsinfrastructure.pdf

Priority initiative +++ Digital sequence information +++ European Open Science Cloud

Digitization and digitality in science are also topics for the Alliance of Science Organizations, which recently dealt with three different facets of them. The priority initiative "Digital Information" has been realigned and extended until 2022. It is now even more up to date and takes greater account of the fact that science can no longer be imagined without digital data and communication.
The Alliance is concerned about efforts to subject the use of digital sequence information from genetic resources, in future, to the regulations of the Nagoya Protocol and the Convention on Biological Diversity; this could have far-reaching implications for the international environmental and life sciences. (Graphic: Shutterstock) In principle, however, the organizations welcome the European Open Science Cloud initiative of the EU Commission. At the same time, they demand an appropriate balance between scientific and political interests. The priority initiative and the statements of the Alliance can also be found online.

Humanities and Social Sciences

Barbara Stollberg-Rilinger

The difficult lot

In early modern Europe, the distribution of goods, the imposition of punishments or the filling of offices were sometimes decided by lot. Considered as a communicative process and a time-bound symbolic practice, this type of decision-making is also a piece in the mosaic of the cultural history of the political.

Counting stick (Manina) and ballot (ballotte), Venice. Photo: Heeke / exhibition catalog Spektakel der Macht, Darmstadt, 2008, p. 81

Some will still remember that at the beginning of the NSU murder trial in Munich in May 2013, the limited number of press seats was allocated by lot, with the result that the Frankfurter Allgemeine and the Süddeutsche Zeitung were among the losers, while Brigitte and Hallo München! were among the winners. The lot treated all applicants equally, and precisely that was perceived as scandalous, because the media competing for the places were highly unequally qualified for the task. The result was a public outcry: could and should such an important decision be turned into a lottery game?

Why this was so is obvious. For us, decision-making is generally based on the rational weighing of reasons, on determining what is true, good and right. We want to plan and arrange things sensibly and to create reliable expectations. Decision by lot, on the other hand, means surrendering to chance. It relieves those involved of all deliberation, of advice, negotiation and compromise, but it also removes personal influence and existing power constellations. Before the lot, all options are equal; the dice are impartial. The lot is the epitome of unavailability.

Yet as unreasonable as the principle of chance appears at first glance, it can still be rational under certain conditions: when the options really are completely equal or when, conversely, they are not comparable at all; when there is an unmanageable multitude of competing criteria for a correct decision; when the necessary time is not available; or when the costs of determining the best option are excessively high. In short: when it is more important that a decision is made at all than which decision is made. In addition, the lot creates equality among those taking part and is therefore an instrument of democratic participation. Lately, some political theorists have been advocating the establishment of committees in which citizens selected by lot are directly involved in the political decision-making process, in order to counteract the loss of legitimacy of parliamentary procedures and political elites. Even scientific funding institutions have recently begun thinking about incorporating random elements into their decision-making processes. (The German Research Foundation, however, is not considering such a thing.) And in other respects, too, proposals to settle decisions of great importance by lot are usually not taken seriously and are dismissed as frivolous.

One wonders why, for it was not always so. In earlier epochs people resorted to the lot more readily than today. Does such a willingness to draw lots say anything about the societies concerned? And if so, what?

Decision-making, too, has its history. If one understands by decision not (only) an internal, mental process but a communicative, social occurrence, then different historical cultures of decision-making can be described, depending on what a society considers decidable and in need of decision, and on how decisions are brought about or, indeed, avoided. It is by no means a matter of course that social action is framed, modeled, perceived and presented as decision-making at all. To make a decision means explicitly distilling a few options for action from the endless sea of possibilities and explicitly committing oneself to one of them. How this specific form of action was dealt with at different times is the subject of our research project. Historically speaking, making decisions is the less likely case, because it is always an imposition. One could always decide otherwise, and at the moment of deciding, the correctness of the choice is never guaranteed.

Two mercenaries, on the right under the gallows tree, roll the dice for their lives; Jacques Callot, La pendaison / The Hanging, Paris. Motif: La pendaison, Jacques Callot / Wiki Commons
This raises problems of legitimation, creates responsibilities, and one can lose face. Decisions are therefore usually better avoided. Drawing lots is one possible answer to these impositions of decision-making: the parties involved shift the decision to a level beyond their own reach and forgo their own power to act, though only within the framework that they themselves set for chance. The lot is "organized chance" (B. Goodwin). Everything depends on the question that chance is supposed to decide and on the point in a procedure at which the lot is used.

In early modern Europe there were lottery procedures for very different purposes and in very different arrangements. Most of the time it was not, as one might think, a matter of determining the divine will in order to arrive at the one right decision. The sortilegium, prophecy by lot, had been expressly forbidden under Roman canon law since the 13th century. It was considered a sinful, even magical practice by which one compelled God to reveal something that he did not reveal of his own accord to human reason. Lots were permitted only where they rested on purely pragmatic human agreement and left God out of the game.

Election to office by lot in James Harrington's utopian Commonwealth of Oceana. Photo: Hüskes. Motif: The Manner and Use of the Ballot, artist unknown / Wiki Commons

That did not rule out that people might still believe in supernatural participation when drawing lots. But this was not the point of the procedure. The lot was used, for example, to distribute goods or burdens: Which of several equal heirs gets which piece of land? Which doctor is sent to the sick during a plague epidemic? Or: Which soldier is executed pars pro toto when a whole troop has refused an order?

On Golgotha at the gates of Jerusalem: mercenaries roll the dice for the cloak of Jesus (detail below right); Lucas Cranach the Elder, Crucifixion. Motif: Crucifixion, Lucas Cranach the Elder / Wiki Commons / Sailko

Lottery elements were used most often in the filling of public offices. Ancient Athens and the medieval city republics of Venice and Florence are only the most famous, but by no means the only, examples. Lots were drawn, for example, in Osnabrück and Münster, in Minden and Unna, in Utrecht, Rotterdam and Deventer, in Bern, Basel and Geneva, in Bremen, Hamburg and Frankfurt am Main. Drawing lots was by no means, as Aristotle once held, a sign of democratic equality per se. The targeted use of chance did not mean that the procedures as a whole were beyond the steering and control of the elites. Everything came down to the specific framing; it made the contingency of the lottery procedure manageable as a whole. The annual mayoral and council elections in premodern cities were essentially rotation and co-optation procedures within a fixed circle of council families, not free elections in which all citizens could equally have taken part in the draw. There were countless variants of the procedure, some of dizzying complexity.

Voting instruments from Basel for the drawing of public offices, 17th/18th century.
Typically, individual electors were drawn by lot from an existing body; they in turn nominated candidates, among whom the drawing was repeated. In Münster, for example, the electoral regulations of 1721 stipulated that the councilors first drew five district heads by lot from among their number. Each of these named eight electors, so-called Kurgenossen; these 40 electors drew ten from among their number by lot; these ten in turn appointed 20 electors; these 20 drew ten from among themselves by lot, and these ten finally elected the new council.

Why all this? It is significant that such elements of chance were mostly introduced in situations of crisis, when the urban elite was torn apart by internal factions and its legitimacy was called into question by the common citizenry. The random principle was meant to eliminate the influence of internal factions and clientele structures and to fight corruption. (Photo: Heeke / exhibition catalog Spektakel der Macht, Darmstadt, 2008, p. 35.) It was hoped that this would restore political stability, from which the long-established elites benefited most. The lot was supposed to make this possible because it had at least three effects: First, it made it incalculable who would actually cast a vote in the end, and in this way made the soliciting and buying of votes more difficult. Second, it involved more people in the process as potential electors and thus increased the legitimacy of the result, for whoever is included is less inclined to contradict the outcome later. Third, the lot protected the losers from losing face and spared their honor, at that time one of the highest and most conflict-prone goods. Conflicts could easily escalate because the early modern cities had little executive power at their disposal. That is why great effort went into creating harmony and facades of consensus. The imposition involved in decision-making was particularly great here, because decisions make dissent clearly visible.
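The 1721 Münster procedure described above is, in effect, a randomized protocol alternating nomination and sortition. The following is a minimal sketch in Python; the generated names for electors and nominees are purely illustrative, and the historical rules about who was eligible to nominate whom were of course more intricate:

```python
import random

def muenster_election_1721(councilors, rng=None):
    """Illustrative simulation of the 1721 Muenster electoral order:
    alternating rounds of drawing lots and nominating."""
    rng = rng or random.Random()
    # 1. The councilors draw five district heads by lot from among their number.
    district_heads = rng.sample(councilors, 5)
    # 2. Each district head names eight electors (Kurgenossen): 40 in total.
    electors = [f"{h}-elector{i}" for h in district_heads for i in range(1, 9)]
    # 3. The 40 electors draw ten from among their number by lot.
    ten = rng.sample(electors, 10)
    # 4. These ten appoint 20 electors (two each, in this sketch).
    twenty = [f"{e}-nominee{j}" for e in ten for j in (1, 2)]
    # 5. The 20 draw ten from among themselves by lot; these elect the council.
    return rng.sample(twenty, 10)

council = [f"councilor{i}" for i in range(24)]
final_electors = muenster_election_1721(council, random.Random(42))
```

The alternation is the point: because each lottery round cuts across the nominations of the previous round, no faction can guarantee that its own candidates survive to the final stage, which is exactly the effect on collusion described above.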
It therefore seems characteristic of the specific decision-making culture of early modern cities that they used lots so often in their complex electoral procedures. In the course of the 18th century, however, chance as a decision-making aid was viewed with growing unease. A renowned jurist found, for example, that it was a kind of disgrace and humiliation when the laws and constitution of a state made such a means necessary. After all, how encrusted and corrupt must a community be if it forgoes rational weighing and prefers to take refuge in blind chance? The lot now appeared as a declaration of bankruptcy of rational decision-making. The greater the confidence in the rationality of human action, the more frivolous drawing lots appears. It is therefore no coincidence that today, when this confidence is fading more and more, there is again so much talk of the lot.

Photo: WWU Münster / Grewer

Prof. Dr. Barbara Stollberg-Rilinger holds the Chair for Early Modern History in Münster. The winner of the DFG's Gottfried Wilhelm Leibniz Prize 2005 is Rector designate of the Wissenschaftskolleg zu Berlin.

Address: Historical Seminar of the WWU Münster, Domplatz 20-22, Münster

"Decision by Lot in the Middle Ages and Early Modern Times" is a sub-project of SFB 1150 "Cultures of Decision-Making" at the University of Münster.

Life Sciences

Dieter Willbold and Silke Hoffmann

A look at the viral construction kit

Using HIV infection as an example, structural biologists want to better understand the interrelations between a disease-causing virus and its host. Will this yield new therapeutic approaches? In any case, it yields insights into basic cell mechanisms.

They are masters of manipulation, impressive quick-change artists and true minimalists: viruses. However successful they may be, they have neither a metabolism of their own nor independent reproduction. Viruses are tiny, about 20 to a few hundred millionths of a millimeter, that is, nanometers, in size, and they live parasitically. After an infection, the host compensates for these deficits almost involuntarily. But viruses are not only a nuisance for the host; at times they are downright angels of death.

Viruses are picky and usually target only very specific cell types. Human immunodeficiency virus type 1 (HIV-1), for example, requires particular recognition marks in the form of surface molecules such as CD4 on its target cells; these are found on certain cells of the immune system. Put simply, one might think that HIV-1 mainly damages those cell types that it can infect. Far from it! Immune cells are chatty and in constant communication with their environment through the release of messenger substances. They are also very mobile, circulating through the whole organism. In an acute HIV-1 infection, a kind of mass panic ensues, which also kills immune cells that the HI virus cannot attack at all. Acquired immunodeficiency syndrome (AIDS) develops, which in the later course of the disease is also associated with neurocognitive disorders. (Illustration: Science Photo Library / Nicolle Fuller)

Whenever a virus-host interrelation is mentioned, it is about the points of contact between a virus and its host, and vice versa.
Considering the few individual building blocks that a virus brings into this relationship as a partner, one would expect a clear and easily surveyable network of interactions. But here, too, the truth is far more complicated. It is hardly a coincidence that no cure has come into sight since the first description of HIV-1 in 1983. Around 37 million people worldwide are currently living with HIV-1; only about half have access to antiviral therapy, and in 2016 alone there were around 1 million HIV deaths.

The HIV-1 genome, which consists of only nine genes, carries the information for fewer than 20 proteins, only about one thousandth of the protein molecules encoded in the human genome. Among them are the proteins that make up the virus particle, as well as the catalytically active molecules that mediate those reactions for which the host provides no cellular molecular machine. The viral kit is completed by a number of so-called regulatory and accessory elements, which, among other things, skillfully defy the host's cellular defense mechanisms through a variety of strategies.

Left: artistically colored illustration of autophagy, the cell's own waste-disposal and recycling system. Right: 900 MHz NMR spectrometer in the Biomolecular NMR Center in Jülich.

The so-called accessory proteins are dispensable for viral replication in cell culture model systems, where the virus deals only with the individual infected cell and not with the entire host organism and its immune system. For the successful infection of the host organism, however, the accessory proteins play a decisive role. One of them is Nef (negative factor). Patients who carry virus isolates with a missing or defective Nef gene belong to the group of LTNPs (long-term non-progressors): they can be infected with HIV-1 for decades without developing the symptoms typical of AIDS. Nef is therefore essential for the pathogenicity, the disease-causing effect, of HIV-1.
From a structural biology point of view, Nef is a flexible, changeable protein, made to interact with a large number of other proteins. This multifunctional protein is therefore also called a master manipulator. Equipped with a lipid anchor, it tends to attach to membranes. Of particular interest are the Nef-induced changes in the complement of surface receptors on the plasma membrane and the immensely increased release of extracellular vesicles (exocytosis). (Photo: Forschungszentrum Jülich, ICS-6)

These lipid-coated packets are also taken up by cells that cannot be infected by HIV, and thus carry Nef, and its assembly instructions, the Nef-encoding messenger ribonucleic acid (mRNA), to all sorts of other cells. In this way various surface molecules are made to disappear and the host organism's immune defense is tricked. The underlying molecular mechanisms are not yet understood in detail. How does Nef get efficiently to its important intracellular site of action, the plasma membrane? And how does Nef drive the exocytosis machinery? This is where the story of the DFG-funded SFB 1208 sub-project "Subversion of Host Cell Vesicle Trafficking: Hijacking of Autophagy-Related Proteins by HIV-1 Nef" begins.

We have long been interested in accessory viral proteins, especially those encoded by HIV. Neurodegeneration and autophagy are now central topics in our structural biology institute. And so one day the idea arose to look for new interaction partners of HIV-1 Nef in the brain, in order to identify those interactions that may be related to the development of HIV-1-associated neurocognitive disorders. Against this background, Nef was bound to a membrane in a special yeast cell system, and this membrane-associated Nef was offered a different human protein as a potential interaction partner in each yeast cell. A hit was indicated by a specific coloring of the corresponding yeast colony. Among other things, this experiment yielded a protein called GABARAPL2 (from GABARAP-like 2) as a Nef interaction partner. That was very interesting, because GABARAPL2, like GABARAP itself, belongs to a group of proteins involved in the process of autophagy.

In autophagy, small membrane-enveloped vesicles, so-called autophagosomes, are constantly forming in the cytoplasm; they enclose cell components that are no longer required or are defective.
These then fuse with another type of vesicle, the lysosomes, which contain enzymes that break the contents down into small building blocks; these are then available to the cell again. It is thus a matter of cellular recycling. Autophagy is a vital process: new biological molecules are constantly being produced in the cell's production facilities, which also generates waste, such as misfolded, clumped or simply surplus proteins, defective mitochondria, metabolic products and much more. If this garbage is not disposed of, it can cause damage.

Presumably, autophagy is much more than just a sophisticated recycling system. It appears to play an important role in a variety of diseases, and it is also important as a cellular component of the immune response. To put it bluntly: if intruders such as bacteria or viruses overcome the body's first line of defense and get into the cell, the recycling system can track them down and elegantly take them out of circulation. Some pathogens have evolved strategies to evade the cellular cleaning crew, for example by stopping it. This also applies to HIV-1. Depending on the cell type, it influences autophagy in different ways in order to ensure its survival in the cell. But that is not all: HIV-1 even hijacks parts of the autophagy machinery and makes use of their services. This is where the interaction of Nef with GABARAPL2 and its relatives (collectively, the GABARAPs) plays a role.

Surface view of the protein GABARAP (in yellow). The positions of the sticky ligand-binding pockets HP1 and HP2 are indicated; the areas contacted by HIV-1 Nef are highlighted in color, the most strongly affected areas in magenta, slightly affected ones in orange. Comparison of the intracellular localization of HIV-1 Nef (in red) in the presence (left) and after knock-down (right) of the GABARAPs.

Once this interaction had been tracked down, interest first turned to its molecular basis.
The team had previously determined the three-dimensional structure of its close relative GABARAP using nuclear magnetic resonance (NMR) spectroscopy. In addition, it was known what an interaction partner had to bring along in order to fit well into GABARAP's two sticky (hydrophobic) binding pockets. Since the 3-D structure of GABARAP and the associated datasets were already available, it was relatively easy to identify the contact area for Nef on the GABARAP side. In the meantime it has been shown in the cell culture system that Nef needs at least one of the GABARAPs in order to find the inside of the cytoplasmic membrane. Many questions are still open: How exactly do the GABARAPs help Nef find its way to the plasma membrane? How does this affect their interplay with surface receptors? Can Nef get out of the cell without GABARAPs? And how exactly are the export packages for Nef, i.e. the extracellular vesicles loaded with Nef, created? Do both processes even partially follow the same mechanism? One idea would be that the GABARAPs couple the process-relevant vesicles precisely to those miniature express trains that move specifically from the inside to the outside on the cellular rail system, the microtubules. In order to find answers to these questions, interdisciplinary methods are used that range from atomically resolved 3-D views (NMR, X-ray crystallography, cryo-electron microscopy) through cellular techniques (optical microscopy and biochemistry) to genome editing and proteome analysis. The knowledge gained in this way is then evaluated in close cooperation with virologists in the biological system. It is often asked whether a new therapeutic approach against HIV-1 can be expected from this work. Maybe, maybe not. In any case, with the help of HIV-1, fundamentals of human biology can be learned, and in particular something about the process of autophagy in detail.

Prof. Dieter Willbold heads the Institute for Physical Chemistry at the University of Düsseldorf and is the director responsible for structural biochemistry at Forschungszentrum Jülich. Dr. Silke Hoffmann is a work group leader there.

Address: Forschungszentrum Jülich GmbH, Institute of Complex Systems (ICS), Structural Biochemistry (ICS-6), Jülich. DFG funding as a sub-project in the SFB 1208 "Dynamics of Membrane Systems" (Project B02).

Images: Forschungszentrum Jülich (ICS-6). Photo: FZJ / W. P. Schneider. Model: Forschungszentrum Jülich (ICS-6)

Natural sciences

Vera Schlindwein

On course: the ultra-slow Gakkel Ridge

Plate tectonics, volcanic activity and spreading of the ocean floor in the Arctic: after extensive research expeditions and earthquake measurements, the Emmy Noether group MOVE has gained surprising insights into the formation and structure of the oceanic lithosphere. A workshop report.

The research ship sometimes ends up in stormy seas in the Furious Fifties on the Southwest Indian Ridge. Photo: AWI / Florian Schmid

Frozen over in long winters, the Arctic is slow; progress is arduous and only possible at a slow pace. Polar explorers, too, need patience and perseverance, yesterday as today. Even deep under the Arctic sea ice, progress is slow, indeed ultra-slow: while the world's oceans grow by more than 20 millimeters every year at the seams of the mid-ocean ridges, new ocean floor forms along the Arctic ridge system and its relative, the Southwest Indian Ridge (SWIR) halfway between Africa and Antarctica, at less than 15 millimeters per year. For a long time, ultra-slow mid-ocean ridges hardly played a role in the study of plate tectonics: the Arctic ridge system is difficult to reach because of its cover of sea ice, and the sea in the Furious Fifties at the SWIR is too rough for ambitious research projects. In addition, it was believed that the much better researched processes of ocean floor formation on slow ridges could simply be extrapolated to the ultra-slow ridges. But then a huge swarm of earthquakes in the Arctic Ocean made geophysicists sit up and take notice: the earth shook for nine months, sometimes with magnitudes of 5, near a large volcano on the Gakkel Ridge. Normally, the countless volcanic eruptions on mid-ocean ridges go completely unnoticed: the young ocean lithosphere, especially in the area of the volcanic ridges, is too warm for larger earthquakes that can still be registered more than 1000 kilometers away on land. In addition, volcanic eruptions on ultra-slow ridges were considered rare. When the lithospheric plates at mid-ocean ridges drift apart, the earth's mantle melts as the pressure is released. The melt then rises and, as magma, continuously closes the gap between the plates. In this way, an earth crust about 6 to 8 kilometers thick is created everywhere in the oceans.

Maps of the ultra-slow spreading ridges in the Arctic (left) and in the Southwest Indian Ocean (right).
On the ultra-slow ridges, this motor starts to stutter and only a little melt is produced. But how did this volcanic eruption fit into the picture of a region said to have a spreading rate of only 9 to 10 millimeters per year? In 2001, an interdisciplinary expedition set sail for the first time with the two icebreakers USCGC Healy and FS Polarstern to systematically map the Gakkel Ridge, collect rock samples from the sea floor, measure the thickness of the crust, search for hot springs on the sea floor and measure earthquakes on site. This expedition, with four publications in Nature, can safely be described as groundbreaking, but above all the realization matured that the ultra-slow ridges are by no means just especially slow slow-spreading ridges, but represent a class of their own. Their main characteristic became apparent: the thickness of the earth's crust varies greatly along the ridge. While a thin crust and many volcanic structures characterize some stretches of the ridge, in other, amagmatic areas the sea floor lies up to 5000 meters deep and mantle rocks are exposed directly on the sea floor. These often 100-kilometer-long amagmatic areas without significant volcanism are interrupted by gigantic volcanic centers with a mighty earth crust. And just such a volcano seemed to have erupted in 1999 with a great roar of earthquakes. At this point in time, at the beginning of 2003, the future project leader came into contact with an ultra-slow ridge for the first time. Nobody had really dared to approach the earthquake data recorded during the trip: the measuring method of placing seismometers on drifting ice floes in order to record earthquakes seemed too unusual. She had worked with unusual seismological data before and was fascinated. The technology worked, and a few small earthquakes could be recorded amid the cracking of the ice floes.

Maps: AG Schlindwein. Hard work in the icy cold: a seismometer is set up on an ice floe in the Arctic sea ice. Photo: AWI / Vera Schlindwein
They indicated that small gas explosions were taking place near the volcano, under the enormous pressure of a 4-kilometer water column. That was a surprise, and a motivation to take a close look at the earthquake activity of these ultra-slow ridges. The idea for the Emmy Noether project was born: it was to investigate the seismicity of ultra-slow ridges systematically, comparing magmatic and amagmatic ridge sections, and on top of that on different scales, from the smallest earthquakes, which provide information about spreading processes locally, up to large earthquakes, which provide large-scale information about ocean floor formation across entire ridges. In September 2006, the Junior Research Group Middle Ocean Volcanoes and Earthquakes (MOVE) started. For family reasons, the project was designed from the start to run part-time for eight years. This also had the advantage that patience and perseverance could be invested in the tedious acquisition of the earthquake data; in retrospect, that would not have been possible with a time horizon of five years. Since the data for large earthquakes are publicly available in catalogs, their analysis could begin immediately. Recording the smallest earthquakes on site with seismometers, which can provide information about active spreading processes and about the structure and temperature of the lithosphere, required ship time on the RV Polarstern. Initially, with piggyback experiments on the RV Polarstern and the IB Oden, earthquake data could be collected from drifting ice floes. Classical measurements with ocean bottom seismometers (OBS) were unavoidable, but impossible in the ice-covered Arctic Ocean, because the OBS resurface somewhere within a kilometer of their deployment position, possibly under an ice floe. The team therefore switched to geologically similar areas of the SWIR in order to be able to use OBS. Since a cost-intensive research ship cannot wait idly for good weather for the OBS recovery, a 35-member interdisciplinary research group set sail on the RV Polarstern in 2013 to spend a month researching together in the stormy measuring area. In parallel, data on volcanic activity in more moderate latitudes of the SWIR could be collected on two further voyages.

Left: Seagulls are reliably interested in ocean bottom seismometers, making it easier to find them. Right: Researchers during a test run with a sea-ice seismometer.
Photo: Nataliya Koev. Photos: AWI / Vera Schlindwein. In the diagram: elaborately collected earthquake data from the Southwest Indian Ridge provide information about spreading processes.

Seven years after the start of MOVE, a comprehensive earthquake dataset, collected with immense effort (seven ship expeditions!), was available. It confirmed the assumption that the earthquake activity on ultra-slow ridges permits surprising insights into the formation and structure of the young ocean lithosphere. After locating over 5000 earthquakes, the team discovered the deepest earthquakes yet found on mid-ocean ridges, at a depth of 35 kilometers. They showed that the young ocean lithosphere in the amagmatic areas is much colder than previously assumed. Under the volcanoes the lithosphere thins out, so that melts at its base can flow from the cold amagmatic areas towards the volcanoes. Just such a topography of the lithosphere had been postulated by petrologists (rock researchers) to explain the uneven distribution of melts on ultra-slow ridges; these results provided a first geophysical confirmation of the theory. Another finding was particularly exciting: down to a depth of 15 kilometers there were no earthquakes in the areas where mantle rocks are found on the sea floor. In contact with water, mantle rock turns into a very soft rock called serpentinite, which does not break in earthquakes but deforms like soft soap. Conversely, this means that water can penetrate to undreamt-of depths of 15 kilometers, and an exchange of substances between the lithosphere and the ocean is conceivable on a much larger scale. On the SWIR, the team also succeeded in examining a volcano that had repeatedly drawn attention to itself with large earthquakes for over ten years. Indeed, a magma chamber was found under the volcano, and in a rare in-situ measurement of submarine volcanic activity, the OBS recorded a magma intrusion and its seismic tremor live.
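The spreading rates quoted in this article can be put into perspective with a small back-of-the-envelope calculation. The following Python snippet is an illustrative aid, not part of the original research; the rates used (more than 20 millimeters per year for typical mid-ocean ridges, roughly 9 to 10 millimeters per year for the Gakkel Ridge) are simply the figures mentioned above:

```python
# Back-of-the-envelope: width of new ocean floor created at a mid-ocean ridge.
# Full spreading rates (mm/year) as quoted in the article: typical ridges
# grow the ocean by >20 mm/yr, ultra-slow ridges such as the Gakkel Ridge
# by only about 9-10 mm/yr.

def new_floor_km(rate_mm_per_year: float, years: float) -> float:
    """Width of new ocean floor (in km) created over `years` at a given
    full spreading rate (in mm per year)."""
    return rate_mm_per_year * years / 1e6  # convert mm to km

# Over one million years:
for name, rate in [("typical ridge", 20.0), ("Gakkel Ridge", 9.5)]:
    print(f"{name}: {new_floor_km(rate, 1_000_000):.1f} km per million years")
```

Even at ultra-slow rates, the numbers add up over geological time: roughly 10 kilometers of new ocean floor per million years, against about 20 kilometers or more at a typical ridge.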
Even if the Young Investigator Group MOVE only moved ahead slowly, the knowledge it gained was considerable and raised many new questions. Since polar research is known to progress slowly, a follow-up project is already underway: since 2017, the team has had earthquake recordings from 27 OBS spread over a 160-kilometer stretch of ridge south of Svalbard. This is the most comprehensive micro-earthquake dataset for mid-ocean ridges to date. In addition, the prototype of a sea-ice OBS is awaiting its dress rehearsal, in order to investigate questions of the exchange of substances between the lithosphere and the ocean on a large interdisciplinary expedition to the hydrothermal vent AURORA on the ice-covered Gakkel Ridge.

The geophysicist PD Dr. rer. nat. Vera Schlindwein is head of the Emmy Noether Young Investigator Group Middle Ocean Volcanoes and Earthquakes (MOVE) at the Alfred Wegener Institute in Bremerhaven.

Address: Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research, Am Handelshafen 12, Bremerhaven. DFG funding in the Emmy Noether Programme of the DFG.

Graphics: AG Schlindwein

Engineering sciences

Barbara Perlich and Julia Hurlbeck

The cupboard in the east wall

Building researchers and conservators in Erfurt have, completely unexpectedly, discovered a private Jewish prayer room from the 13th century. The first verifiable ensemble of this type north of the Alps reveals a great deal about everyday Jewish piety in the Middle Ages.

"Salomon, iudeus de Werceborc, de curia quondam Riche iudee i sol", so it is recorded in Erfurt's tax list of 1293: Salomon, a Jew from Würzburg, pays one shilling for his courtyard, which formerly belonged to the Jewess Riche. A little later we learn more about this same Salomon of Würzburg: "Salman de Erbipoli de curia quodam Richen, iudea de Northusen, sita in platea iudeorum i. sol." The Jewess Riche thus came from Nordhausen, and the courtyard lies in the platea iudeorum, the Judengasse in Erfurt. The location of the medieval platea iudeorum is well known, as it is today's Rathausgasse behind the neo-Gothic town hall. The present-day quarter at Benediktsplatz 1, which borders on the former platea iudeorum, shows at first glance a building stock that has grown over centuries. During the renovation and restoration work that had been going on in the quarter since 1992, a painted wooden-beam ceiling from 1244 was discovered. The prospect of being able to reconstruct the development of a high-medieval residential quarter and assign the buildings to their former residents was therefore very good.

With the conversion into a prayer room, a new entrance, a light niche in the wall and the ceiling were created; the light niche could apparently be closed with a grille so that an oil lamp could burn unattended. Photo: Elisabeth Nitz. Medieval buildings and chains of names: in the tax lists, the debtors were recorded in topographical order.
Photos: Barbara Perlich

Since the beginning of the research project "A High Medieval Jewish Residential and Commercial Complex in Erfurt and its Spatial Arrangement" in the spring of 2015, a team of building researchers, restorers, historians, art historians and Judaic scholars has been pursuing precisely these goals, and it has gained some amazing insights. Looking back: in 1222, an entire Erfurt city district burned down once again. Nevertheless, on the basis of the masonry and the structural forms found, four stone buildings in the investigated quarter can be assigned to a construction period before the fire of 1222. These Romanesque stone buildings (called Kemenaten, from the Latin caminata: equipped with a chimney) usually had a more or less square floor plan and only one room per storey; they mostly stood a little back from the street at the rear of the parcels. Immediately after the devastating fire of 1222, a fifth Kemenate was built in the quarter, for whose foundations fire-damaged stone material from a demolished building was used. This new stone building corresponded to the standard type, with a timber structure in front and all gates on one side of the building. With external dimensions of around 8 x 8 meters, it fitted well into the size customary for Erfurt Kemenaten: it had one room per storey, the basement and ground floor had flat wooden-beam ceilings, and above the upper storey was a cantilevered ceiling. The upper-storey room had no special features that would have set it apart from other upper-storey rooms in Kemenaten of the 12th and 13th centuries; the unheatable room was probably used as a bedroom. Around 1244, however, twenty years after it was built, this inconspicuous room was completely redesigned. The original access to the room in the east wall was given up in favor of a wooden cupboard set up in its place.
This cupboard is attested by fastening holes on both sides of the former doorway, in which the frame of the cupboard was fixed with metal straps. Since the cupboard blocked the original access in the east wall, a new one had to be created in the north wall. Instead of the older ceiling construction, a completely new wooden-beam ceiling was installed: neither the ceiling beams nor the floorboards of the older ceiling were reused; rather, the timber required was felled between 1242 and 1244 and built in fresh. Soon after installation, this ceiling was painted all over and decorated with plant and tendril motifs as well as flowers. Six noticeably large decorative nails were hammered into the new ceiling along the east-west axis of the room, opposite the cupboard in the east wall. Their position on the side surfaces of the beams rules out anything being hung between the nails; rather, they served as individual suspension points, probably for hanging lamps. A pointed-arch niche was let into the north wall for another light source; a surrounding rebate shows that the niche could certainly be closed with a grille. Obviously, an oil lamp often hung here unsupervised, as the oil spilled on the wall shows. There are similar traces of lamp oil below horizontal impressions on the north and south walls of the room: apparently shelves were mounted there with lamps standing on them. Also in 1244, the northern of the two window niches facing the platea iudeorum was broken open down to the floor level of the upper storey. This enlargement of the niche passes through the whole wall and led to a (no longer extant) bay window resting on corbels, which are still clearly visible in the outer wall. These findings, a cupboard in the east wall, elaborate ceiling painting, lamps hung in the axis of the cupboard, an (unattended) light in a light niche, a bay window towards the street, lamps on wall shelves, suggest that a private Jewish prayer room was set up here in 1244.
The strongest indication is the cupboard in the east wall: cupboards were not part of common furnishings in the middle of the 13th century; instead, all everyday items were kept in chests, merchandise in barrels or sacks. We know cupboards from monasteries, as reliquary cupboards and as Torah shrines for storing Torah scrolls. For the initiator of the renovation, it was obviously important not only to set up a cupboard, but to position it centrally in front of the east wall. There would undoubtedly have been enough space in the room to set up a cupboard elsewhere: next to the doorway in the east wall, on the north and south walls, or on the west wall between the windows. The determined will to set up the cupboard in the east of all places can only be explained by its function as a Torah shrine, which stands on the wall facing Jerusalem. Other findings also point to a Jewish prayer room: it can be assumed that lamps could be hung on the decorative nails in the beams opposite the Torah cupboard (see illustration on the right). Opposite the Torah shrine there thus hung six lamps, which illuminated the Torah scroll resting on a desk during the reading. The lamps placed on the shelves along the north and south walls may have provided additional light for reading and studying. In the lockable light niche, an oil lamp presumably provided permanent light, on the Sabbath for example, with the grille serving to protect the lamp, which was left unattended at night. It is possible that this light can even be understood as a Ner Tamid, an Eternal Light in memory of the menorah set up in the Temple. For the exit bay towards the platea iudeorum there was initially no explanation, neither in a Jewish nor in a non-Jewish context.

Photo: Nataliya Koev. Photo: Barbara Perlich. Reconstructed prayer room with Torah cupboard (right), hanging lamps above a bima and exit bay towards the street (left), as well as a light niche in the north wall.
A latrine oriel can be ruled out; evidently, however, it was important to be able to step out of the room. We can reconstruct the 13th-century buildings opposite the Kemenate quite well: they were no lower than the neo-Gothic town hall standing here today, and stood even significantly closer to the Kemenate. From the rather small late Romanesque windows of our room it was therefore not possible to see the sky: a small bay window could have served as a way to step out of the room. The fact that such a bay window was built during the renovation phase of 1244 can possibly be explained by Jewish users wanting to see the sky in order to look for the first three stars in the evening sky, which indicate the beginning and end of the Sabbath. The painting of the ceiling supports the thesis of the Jewish prayer room only indirectly. The chosen motifs are not specifically Jewish; there are no Hebrew characters or the like. However, the decorative design shows that this room was given a certain representative significance, as we know it, for example, from private chapels in patrician houses. The restriction to plant motifs and the omission of depictions of animals, people and mythical creatures, as found in the next-youngest examples of painted ceilings in urban buildings from the early 14th century, do not necessarily speak in favor of a Jewish prayer room, but they would go hand in hand with this use. We know from written references and testimonies that there were private Jewish prayer rooms in the Middle Ages. As the only known private Jewish prayer room to have physically survived, the Erfurt find is now of outstanding importance for our knowledge of everyday Jewish piety in the Middle Ages.

Dr.-Ing. habil. Barbara Perlich, Department of Architectural and Urban History at TU Berlin, is project leader; Julia Hurlbeck, restorer M.A., is project collaborator at the University of Applied Sciences Erfurt in the Department of Conservation and Restoration. DFG funding since 2015 as an individual grant.
Address: Institute for Architecture of the TU Berlin / Department of Building and Urban History, Straße des 17. Juni 152, Berlin

Graphics: Barbara Perlich / Julia Hurlbeck