HYLE--International Journal for Philosophy of Chemistry, Vol. 10, No.2 (2004), pp. 153-168.
http://www.hyle.org
Copyright © 2004 by HYLE and Martin Meyer & Osmo Kuusi


Special Issue on "Nanotech Challenges"



Nanotechnology: Generalizations in an Interdisciplinary Field of Science and Technology

Martin Meyer & Osmo Kuusi*

  

Abstract: This paper reports on work-in-progress in the area of technology generalization. More specifically, it presents a model that allows integrating various expectations regarding emerging technologies. Nanotechnology is used as an example of a novel field of science and technology. The notion of leitbild (‘guiding image’) is used as a mediating concept pointing to potentially emerging technologies. Then we discuss to what extent patent and publication data can facilitate identifying scientific and technological trends and how to evaluate the epistemic utility of a leitbild.

Keywords: nanotechnology, technology generalizations, leitbild systems, foresight, Delphi.


 

1. Introduction [1]

The Kuhnian notion of ‘paradigm’ is commonplace nowadays. Dosi first introduced that notion in technology studies. He assumed that ‘normal’ technological change consists of incremental, relatively small improvements that follow bigger, revolutionary (and therefore ‘scarce’) technological breakthroughs which ultimately result in new technological paradigms. According to Dosi (1982, p. 152) a technological paradigm "embodies strong prescription on the directions of technical change to pursue and those to neglect". Dosi (1988) defined a technological paradigm as a

model and pattern of solution of selected technological problems, based on highly selected principles from natural sciences, jointly with specific rules aimed at acquiring new knowledge […] A technological paradigm is both an exemplar – an artifact that is to be developed and improved – and a set of heuristics.

Since Dosi, the notion of ‘technological paradigm’ has been used by so many researchers that this concept, too, has become commonplace. Substantial qualitative and theoretical work is available on the emergence of a new technological paradigm. Debackere and Rappa (1994) have suggested that technological paradigms typically emerge in two phases: bootlegging and bandwagon.

During the bootlegging period, which may last for a long time, a small number of researchers dedicate themselves to furthering the field. Their peers may not share their enthusiasm. Frequently, researchers from such an emerging community have to face severe criticism. Typically, they have difficulties in securing adequate funding, hence, the term ‘bootlegging’. Typically, a few isolated individuals start working on similar problems with roughly similar ideas (Debackere & Rappa 1994, pp. 27-28).

Researchers who are dedicated to a new and unorthodox field of inquiry often face a difficult dilemma. On the one hand, before receiving resources, they need to provide more proof that their work will yield results. On the other hand, without resources, they are unable to produce precisely that proof.

‘Bootlegging’ enables fledgling research to proceed without the full knowledge and scrutiny of managers and other researchers, up to a point at which the promise of the idea is clear. During this phase, then, the community is highly concentrated among a small number of organizations, and the yearly increase in the number of researchers is fairly moderate (ibid.).

As the number of individuals working on the same problem area increases, a communication network emerges with ties that are much stronger than the ties binding the individuals to the organizations they formally belong to. During this second, so-called bandwagon phase of the community life cycle, a very rapid increase in the number of researchers working in the community occurs over a relatively short period of time.

As the community grows, a new paradigm comes into being, which the higher-level network of the (sub-)discipline registers as competing with the older paradigm. The community tries to organize congresses and found journals, so as to be able to steer the selection process. The R&D community is typically distributed across organizations, sectors, and countries. If the work of a new community seems interesting from a commercial point of view, some scientists may be recruited by enterprises, while some who already work within industry are allowed to devote their efforts openly to the new field. Finally, some scientists may decide to become entrepreneurs themselves.

In terms familiar to the field of futures studies, one can compare the new paradigm in the bootlegging stage with a weak signal that only a few take seriously. In the bandwagon stage it develops towards a strong signal that has to be taken into account.

This paper presents an overview of our theoretical work regarding leitbilds. After introducing the basic concepts we apply our heuristics to nanotechnology. Drawing on a number of technical reports on developments in nanoscience and technology we try to characterize the leitbild system of nanotechnology. We discuss the potential use of patent and publication based data to generate topics within the aforementioned leitbild systems. The paper concludes with a suggested model as to how one can evaluate the epistemic utility of a leitbild.
 
 

2. Technology generalizations and leitbilds

Technology generalizations are different types of perceived similarities between an already existing technological innovation and a potentially new technology. The similarities concern both the techniques applied in the innovation and the targets that are achieved on the basis of the innovation (Kuusi & Meyer 2002). Two techniques are similar in the sense that they could replace each other in the achievement of (defined) targets. Another form of generalization is based on the realized techniques of the innovation being used for new, ‘similar’ applications.

A concept that illustrates the guiding function of an emerging technological paradigm is the ‘leitbild’. ‘Leitbild’ is a German word. Its most general meaning is ein Bild, das leitet, a guiding image. According to Marz and Dierkes (1994), a leitbild has two functions, guidance and image. The guidance function consists of three subfunctions: (1) creating a shared overall goal, or ‘collective projection’; (2) orientation toward one long-term overall goal, or ‘synchronous preadaptation’; (3) working in the same direction, or ‘functional equivalency’. The image function consists of three subfunctions: (1) cognitive activator; (2) providing a focal point, or ‘individual activator’; and (3) ‘interpersonal stabilizer’.

Like a common vision, a leitbild creates a shared overall goal, offers orientation toward one long-term overall goal, and provides a basis for different professions and disciplines to work in the same direction. A leitbild refers not only to a common vision of actors; it also relates to the concept of autopoiesis (self-organization) and functions as an interpersonal stabilizer. With an efficient leitbild, no center is needed that urges or controls individuals to perform certain functions.

Inspired by Marz and Dierkes (1994), we characterize the general rules of an emerging paradigm as a system of leitbilds. An emerging technological paradigm is typically a system of many competing leitbilds. In the bandwagon (or paradigmatic) stage, one leitbild often begins to dominate. Leitbilds are used in visions, but it is important to distinguish between a ‘leitbild’ and a ‘vision’.[2]

Followers of a leitbild form a kind of ‘intellectual community’, but as long as their visions differ, they usually do not establish a real R&D community. The intellectual community of a leitbild typically integrates several R&D communities and their members.

We use the notion of technological leitbild systems (Kuusi & Meyer 2002) to explore inter-relations and connections between seemingly separate areas, because a leitbild system can establish links through similarities or analogies. A leitbild system is a system of guiding images that create a shared overall goal, offer orientation toward one long-term overall goal, and provide a basis for different professions and disciplines to work in the same direction. Thus, a leitbild system defines the development path of a technological paradigm.

Bijker (1993) has introduced the notion of a ‘technological frame’ that combines the cognitive and the social sphere, including exemplary artifacts, cultural values, goals, scientific theories, and tacit knowledge. A frame is not fixed, but built up and sustained by the process of stabilizing artifacts, and is internal to the set of interactions within a relevant social group. However, actors can be members of more than one frame/social group with different degrees of inclusion in any frame. Above all, a technological frame provides "the goals, the thoughts and the tools for action", whilst at the same time limiting the freedom to act. In this way interactions create a structure that, in turn, constrains further interactions (Bijker 1993, Martin 1998).

Bijker’s concept of a ‘technological frame’ is quite close to our understanding of leitbilds. It is an important step toward notions of technological trajectories, which are more closely related to concepts of technological determinism. However, the notion of ‘technological frames’ does not give technology the prominent role it deserves. Here, our leitbild concept steps in. Our concept appreciates both the importance of social factors that influence the exploration of technological options and the technological determinants that confine the relevant cognitive processes to certain research, development, and design spaces.
 
 

3. Types of technological generalizations in a technological paradigm

In this section we discuss how future applications can be anticipated for an emerging technological paradigm. Kuusi suggests that a technological paradigm is a "shared generalization language" capable of producing important generalizations (Kuusi 1999). These generalizations are based on a cluster of linked technologies. The language of a promising technological paradigm can be viewed as a cluster consisting of realized and promising targets and realized and promising techniques. Realized targets are existing artifacts – or, more precisely, their properties or functions – while realized techniques are production processes and design methods. The similarity between techniques is based on the perceptions and interpretations of experts in the corresponding field, whereas the similarity between targets is based on the perceptions and interpretations of the users of the artifacts.

The underlying idea of the generalization concept is that existing techniques and targets serve as a platform for a process generating technological options in a multitude of ways. Generalizations are always based on perceived similarities. Emerging paradigms provide similarities based on both realized targets and realized techniques. On the other hand, a technological paradigm is the result of this type of generalization process, its successes and failures. Realized targets, which have been achieved with realized techniques (‘successful exemplars’), and unsuccessful exemplars are ‘concepts’ of the generalization language.

Figure 1 illustrates six different types of generalization. Realized techniques can be generalized so as to predict promising techniques (arrow 1), if both techniques are considered scientifically similar. From the point of view of the paradigm, there are no fundamental technical problems in Type 1 generalizations. It simply requires some effort. For example, once you have realized that a certain virus can be used to transfer a gene to a bacterium, it is reasonable to believe that you might also use another (similar) virus for that purpose. Another form of generalization is based on already realized techniques that bear a potential beyond their current range of application. Techniques can be used to create new artifacts that are (from the point of view of the paradigm) similar (arrow 2). Like Type 1, this generalization is based on scientific similarity, but only partly. For example, once you have realized that you can transfer a gene to a certain bacterium with a virus, it is reasonable to believe that you might transfer the gene in a similar way to another bacterium. But is the gene transfer to the second bacterium as acceptable to your customer as the first transfer? The targets (or the transfers) in both cases might be very similar from a technical point of view but very different from the point of view of your customer. Your customer might consider that the second transfer is irrelevant or even unethical. It is important to realize that technological paradigms as ‘generalization languages’ are also based on customer values. Actually, we assume in our model that similarities between targets are based only on the interpretations of customers.

Once you have realized a target or made a new artifact using a certain technique, you might start thinking about new ways to produce the artifact or new techniques to improve it (arrow 3). This is a new line for technological generalizations, or for enriching the ‘paradigmatic language’. You might eventually include in your paradigm new techniques that have technically very little to do with your original techniques. Consider fusion energy. The original technical idea of the fusion bomb has very little in common with the recent techniques based on the use of huge magnets.

Figure 1. Different types of technological generalizations.

Generalizations of Types 1, 2, and 3 are relatively well grounded. It is possible, however, in the language of a paradigm to make generalizations that are far less grounded. Instead of strong scientific similarities, they are based on possible social developments or on weak scientific similarities (weak scientific or technical signals). One can anticipate techniques that would become promising if somebody first realizes certain targets (arrow 4). For example, if you are able to set up a permanent colony of people on the moon, new efficient ways to produce solar energy on the moon might become possible. Or you might anticipate new targets to be achieved if you could realize a technique that is supported only by weak technical signals (arrow 5). For example, if you can produce energy cheaply, you might provide an abundant supply of fresh water from salt water.

There is still one arrow left in our picture, arrow 6. It means that a person or an organization whose target B has been achieved considers it also possible to achieve the similar target B’. How successful is this type of generalization? Such generalizations are frequently irrational and have often resulted in questionable processes. Why are Type 6 generalizations frequently unsuccessful? The important point is that in our model – as well as in reality – the similarity between B and B’ is based only on the interpretation of the users of the realized artifact. In all other generalizations, similarity interpretations are made either by technical experts alone or by users and technical experts together.

We illustrate our point with an example. Energy users realized in the early 1950s that it was possible to produce commercial energy from atomic fission using techniques similar to those of the fission bomb. Based on this, many users made a Type 6 technology generalization. They considered that in a similar way one could proceed from the atomic fusion bomb to commercial fusion energy, and they provided a considerable amount of funding for the development of commercial fusion power. Though opportunistic technical experts accepted the funding for the development of commercial fusion energy, they were surely aware already in the early 1950s of the huge technical difficulties of that project. In order to produce commercial fusion energy, you have to keep the fuel for a relatively long period at an extremely high temperature and under equally high pressure. That is not needed in the production of energy from atomic fission. If all the money that has been spent on the development of commercial fusion energy had instead been used, e.g., on solar power, the energy situation of humankind might be much better.
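To make the typology concrete, one may read Figure 1 as a small typed graph. The following minimal sketch (in Python; the class and variable names are illustrative only and not part of Kuusi's (1999) formal treatment) encodes techniques and targets as concepts with a realization status, and the six arrows as admissible source-result pairs:

```python
from dataclasses import dataclass
from enum import Enum

class Kind(Enum):
    TECHNIQUE = "technique"
    TARGET = "target"

class Status(Enum):
    REALIZED = "realized"
    PROMISING = "promising"

@dataclass(frozen=True)
class Concept:
    """A node of the 'generalization language': a technique or a target."""
    name: str
    kind: Kind
    status: Status

# Arrows 1-6 of Figure 1 as (source, result) pairs of (status, kind).
GENERALIZATION_TYPES = {
    1: ((Status.REALIZED, Kind.TECHNIQUE), (Status.PROMISING, Kind.TECHNIQUE)),
    2: ((Status.REALIZED, Kind.TECHNIQUE), (Status.PROMISING, Kind.TARGET)),
    3: ((Status.REALIZED, Kind.TARGET),    (Status.PROMISING, Kind.TECHNIQUE)),
    4: ((Status.PROMISING, Kind.TARGET),   (Status.PROMISING, Kind.TECHNIQUE)),
    5: ((Status.PROMISING, Kind.TECHNIQUE),(Status.PROMISING, Kind.TARGET)),
    6: ((Status.REALIZED, Kind.TARGET),    (Status.PROMISING, Kind.TARGET)),
}

def classify(source, result):
    """Return the generalization type linking source to result, or None."""
    for t, (src, dst) in GENERALIZATION_TYPES.items():
        if (source.status, source.kind) == src and (result.status, result.kind) == dst:
            return t
    return None

# The gene-transfer example from the text: generalizing a realized technique
# to a similar, not yet realized technique is a Type 1 generalization.
realized = Concept("gene transfer with virus A", Kind.TECHNIQUE, Status.REALIZED)
promising = Concept("gene transfer with virus B", Kind.TECHNIQUE, Status.PROMISING)
print(classify(realized, promising))   # -> 1
```

On this reading, Type 6 is the only arrow that leads from one target directly to another target, which is why its similarity judgment rests on users alone.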
 
 

4. Application of the model to Nanotechnology

Now we apply the approach introduced in the previous sections to the field of nanotechnology. The term ‘nanotechnology’ was coined by Norio Taniguchi in the 1970s in Japan, where it is associated with top-down miniaturization "which can be regarded as the latest stage in mechanical engineering, which has pursued ever-tighter precision of manufacture and tolerances throughout its history" (Budworth 1996, p. 13). In the 1980s, Drexler began to use the term nanotechnology to denote his vision of molecular manufacturing (Drexler et al. 1991, p. 294).

According to Grupp (1993, p. 65), "nanotechnology will have a key position in the technological development of the 1990s and in the first decades of the 21st century". He described the field as an enabling technology that "makes possible engineering at the level of atoms and molecules" and continues:

This new basic technology can stimulate future innovation processes and new generations of technologies. It is based on the interaction of information technology, polymer research, optics, biochemistry and medicine and micromechanics.

Grupp’s characterization of nanotechnology indicates the early-stage character of the field, but also shows the potential it holds. His description further underlines the interdisciplinary and cross-boundary nature of the area, which poses a substantial challenge for the collaboration between sectors and disciplines that is perceived as necessary. Considerable efforts from various sides have been undertaken to forecast the development of this novel field of science and technology. For instance, the German Mini-Delphi study chose nanotechnology as an explicit category.

Table 1. Nanotechnology topics in the German Mini-Delphi study, grouped by leitbild (adapted from BMBF 1996). The period after each topic is the forecast time of realization.

Leitbild I
20. An analytical method that sorts out a particular type of atom using high-definition surface-analysis techniques will be in practical use. (2001–05)
22. Reaction and synthesis methods at individual atoms or molecules, at the atomic or molecular order of magnitude respectively, will be in use, applying techniques from scanning tunneling microscopy. (2006–10)

Leitbild II
16. Methods to synthesize substances with new functions (e.g., polymer crystals with weak bonds) will be developed by way of combining various types of bonds at the atomic level. (2006–10)
17. Nanostructured materials with predetermined properties will be manufactured. (2001–05)

Leitbild III
14. Functional materials and/or semiconductor components whose compositions and doping densities vary from atomic layer to atomic layer will be widely used. (2006–10)
18. Organic hybrid composite materials that are based on the control of monomolecular layers will be developed. (2006–10)

Leitbild IV
19. Organic–inorganic composite materials whose elements are on the scale of several to a few dozen nanometers will be developed (e.g., biomimetically). (2001–10)
B. Organic, molecularly composed materials will be developed using the natural method of self-organization. (2006–10)

Leitbild V
15. Electronic solid-state components that consist of artificially composed ‘super atoms’ will be developed. (2006–10)
21. ‘Atomic function elements’ (atomic switches, atom relay transistors, etc., in which the movement of a small number of atoms causes logical and/or storage functions) will be in practical use and have a higher reliability and processing velocity than solid-state components. (2011–15)

Table 1 lists a number of Delphi topics from the nanotechnology section that can serve as examples. We have rearranged the topics according to the various leitbilds and analyze them below in terms of five of our types of generalization.
 

Leitbild I (‘Nano-resolution tools’): Generalizing from realized to promising techniques (Type 1)

Nano-resolution analytical methods as depicted in topics 20 and 22 can be viewed as generalizations of Type 1 – from already realized techniques to other promising techniques. The aim here is to further improve existing tools, typically in an incremental fashion, by adding new functions to analysis tools. In our example, realized techniques, such as atomic force microscopes (AFMs) or scanning tunneling microscopes (STMs), are further generalized into promising tools that are not yet developed but conceivable from the already existing technological platforms. Further, very incremental developments of scanning force microscopes can be expected to improve reaction and synthesis methods or chemical analysis.

Along with further technical development of scanning-probe methods, researchers are discovering new phenomena in the fields of physics, chemistry, and biology. At the same time these microscopy techniques are increasingly used as a ‘tool’ rather than a ‘probe’. The idea is to modify surfaces and tailor their structures on the nano-scale, down to the manipulation of individual atoms (Frenken, 1998, pp. 289-299). Ultimately they might facilitate large-scale manipulation at the nanometer level. However, this transcends the possibility of Type 1 generalizations (see leitbild V below).
 

Leitbild II (‘Nanomaterials’): Generalizing from realized techniques to promising targets (Type 2)

Nanomaterials are an area that is characterized by Type 2 generalization, the transition from realized techniques to promising targets. Together with a better scientific understanding of the subject matter, a variety of already realized techniques allow developing rather specific ideas of improved materials. By taking advantage of nanoscale characteristics of structures and substances, one may create new materials with enhanced properties, such as polymers, composites, or other materials (topics 16 & 17). Rather than direct control of individual atoms, bulk operations suffice to exploit these nanoscale properties.

Another example of bulk-processing nanomaterials is colloidal dispersions (Philipse 1998, pp. 171-8). Colloid science deals with the physics and chemistry of finely dispersed particles with at least one dimension in the submicron range, including nanoparticles, which are frequently defined as particles smaller than 100 nm. Colloid science has a long tradition involving nanoparticles, so not all that is nano is necessarily new. In this sense, colloids encompass gold colloids, colloidal silica, and aluminum oxide powders. Due to their small dimensions, colloids exhibit Brownian motion. Owing to their large surface area, the interaction between colloidal particles in the liquid phase is determined by surface forces, such as Van der Waals attractions, and repulsions due to the particle charge. The balance between these forces critically depends on the details of the particle surface and of the liquid composition. Colloids easily aggregate to form large aggregates, networks, or gels. While there are already techniques to control these aggregation processes to some extent, our understanding remains limited. Yet we know enough about the existing techniques and about potential ways to improve them to envisage improved properties of materials and, ultimately, of products such as milk, cosmetics like toothpaste or sunscreen, or ink, which are nothing but suspensions of colloids or dispersions. Computer simulation and statistical mechanics are tools used to further understand colloidal systems.
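Why particles of this size are inevitably Brownian can be seen from a back-of-the-envelope calculation. The following sketch (in Python; the parameter values are merely typical and not taken from Philipse 1998) applies the standard Stokes-Einstein relation to a 50 nm particle in water and simulates its free diffusion:

```python
import math
import random

# Stokes-Einstein diffusion coefficient for a spherical colloid (SI units).
kB = 1.380649e-23          # Boltzmann constant, J/K
T = 298.0                  # temperature, K
eta = 1.0e-3               # viscosity of water, Pa*s
r = 50e-9                  # particle radius: 50 nm, i.e. a nanoparticle
D = kB * T / (6 * math.pi * eta * r)   # roughly 4e-12 m^2/s

# Free Brownian motion of one particle: each displacement component is
# Gaussian with variance 2*D*dt per step (no interparticle forces here).
dt, steps = 1e-3, 1000
x = y = 0.0
for _ in range(steps):
    x += random.gauss(0.0, math.sqrt(2 * D * dt))
    y += random.gauss(0.0, math.sqrt(2 * D * dt))

rms = math.sqrt(x * x + y * y)
print(f"D = {D:.2e} m^2/s, displacement after {steps * dt:.1f} s is about {rms * 1e6:.2f} micrometers")
```

The few micrometers of random displacement per second obtained here indicate why nanoscale colloids stay dispersed and why their aggregation is governed by the surface forces mentioned above rather than by sedimentation.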
 

Leitbild III (‘Ultra-thin Films’): Generalizing from promising targets to promising techniques (Type 4)

Thin-film techniques are an example of Type 4 generalization from promising targets to promising techniques. Realized techniques already permit sufficiently exact operations at the nanometer level to suggest the idea of future products that would require even more exact and precise tools. This generalization requires a preceding Type 2 generalization. Thin-film technologies are a comparatively well-developed field. The ultra-fine production of thin films is necessary for their subsequent characterization. Designing ultra-thin layers is associated with a number of aims, such as atomically exact delineation of layers, quantized potential distribution, defined pore distribution in layers, ultra-thin separation and protection layers, and improved layer function by way of multilayer structuring. These targets are in turn motivated by and related to many technical applications, including information storage layers, films with quantum effects, optical layers, multilayer piles for semiconductor lasers and X-ray optical compounds, displays, sensor layers, tribological films, biocompatible films, photovoltaic films, membrane films, and chemically active surfaces (Bachmann 1998), which are the starting point for Type 4 generalizations toward new, improved techniques.

Two topics in our Delphi example correspond to this type of generalization (topics 14 & 18). Here efforts appear to be directed at characterizing these structures. Topic 18, for instance, suggests that the control of monomolecular layers will allow developing organic hybrid composite materials. The aim of controlling monomolecular layers, while not yet achievable, is based on the progress made with existing tools and techniques, which allows speculating about the properties of new products or processes, which in turn leads to the next step towards improved instruments.
 

Leitbild IV (‘Biomimetics’): Generalizing from realized targets to promising techniques (Type 3)

The topics in the area of biomimetics (19, B) are examples of Type 3 generalization from realized targets to promising techniques. The idea is to simulate nature in order to develop materials with novel properties by way of self-organization. The biomimetic approach can be used as a path to obtaining novel materials, using self-assembly techniques to make organic templates on which inorganic structures are then deposited (Budworth 1996, p. 7).

While basic principles of self-organization are known, we still need to integrate various techniques to achieve the target of controlled self-assembly. Although one can create structures by way of self-organization in a biomimetic process, our technological means are still incomplete to fully utilize the potential this leitbild offers. Being aware of the general feasibility – thanks to already realized artifacts – we can make reasonable assumptions about the requirements of the techniques necessary to pursue this path of development further.
 

Leitbild V (‘Direct control of atoms’): Generalizing from promising techniques to promising targets (Type 5)

Topics 15, 21, and 23 in the Delphi study describe a leitbild that focuses on the direct control of atoms in order to rearrange them into new structures that could result in novel materials. This leitbild follows a Type 5 generalization, from promising techniques to promising targets. Building on a Type 1 generalization, it is first based on the availability of promising techniques from which promising targets are then projected. As pointed out for leitbild I, we can reasonably expect current STM and AFM technologies to be further developed into more complex tools that, beyond measurement and observation, can efficiently manipulate structures at the nanometer scale. From such promising techniques one can make the Type 5 generalization step to improved and novel artifacts.

The difference between the materials approach (leitbild II) and leitbild V lies in the different control of processes: bulk reactions versus atomic control. Atomic control is also strongly related to the idea of atoms being effectively used as carriers of certain functions, such as data storage.
 
 

5. The leitbild system of nanotechnology

All the different approaches we call leitbilds belong to one greater whole that will eventually develop into a technological system. As long as the exact shape of that technological system is unclear, we speak of a leitbild system instead. One element of this leitbild system might even substitute for and outdate another leitbild. For instance, what we identified as leitbild V could one day replace leitbild II. Even though both approaches refer to nanostructures, they are essentially different: while II uses bulk methods, V aims at direct atomic control.

A leitbild and, even more so, a leitbild system is shaped by the integration of a number of communities. Even though leitbild II is a field that is relatively close to realization, it still critically relies on the integration of knowledge from a variety of disciplines and of expertise from a number of industrial sectors. For instance, even for monitoring and controlling activities at the bulk level, it is necessary to use nano-resolution instruments. The borderlines between science and engineering disciplines become blurred, and disciplinary fields tend to fuse, as in the field of materials science and engineering. This is even more apparent in the area of biomimetics, which tries to simulate natural principles to build up structures. At the nanometer level, the boundaries between disciplines tend to disappear.

This is why we can refer to nanotechnology as a leitbild system that integrates different approaches, each of which is autonomous enough to bear its own identity, but also depends to a greater or lesser extent on results from the other fields.
 
 

6. How to promote technological generalizations related to emerging leitbilds?

People committed to different leitbilds differ considerably in their evaluations of the future prospects of generic technologies. How can we make their different evaluations and interpretations more explicit? Kuusi (1999) has suggested that we can handle the difference by measuring epistemic utility. The idea is that it becomes more reasonable for an actor to start the realization process of a certain option as the epistemic utility of that option increases.

In the bootlegging stage of a leitbild, only a few actors believe in the reasonableness of the underlying generalizations. Most experts think that the generalizations will not be realized at all or that it will take too long before it is reasonable to start the realization process. Once the leitbild has proceeded to the bandwagon stage, a majority of actors believe in a rather quick realization of the generalizations. The epistemic utility of the topic has then increased dramatically for average actors. Any new successful generalization of the emerging technology presented in the process between the bootlegging stage and the bandwagon stage contributes to this growth of epistemic utility.

In this paper, we will not discuss how to measure epistemic utility (see Kuusi 1999). It is sufficient to mention four aspects of the epistemic utility of a technological generalization. The epistemic utility is related, first, to the anticipated impacts of the generalization; second, to the value (positive or negative relevance) given by relevant stakeholders to different impacts; and, third, to the techniques available for the realization of the generalization. In the bootlegging stage, a champion of a technological generalization typically evaluates these aspects much more positively than mainstream actors do. The important fourth aspect is the evaluated validity of the anticipations concerning the first three aspects.

National technology foresight Delphi studies have used ‘proxy’ measures for the variables covering the four aspects of epistemic utility. The degree of importance of each topic has been measured by the Delphi panelists’ evaluations (Cuhls & Kuwahara 1994, NISTEP 2001), which refer to our first two aspects: the impacts and their relevance. The evaluation scales do not allow a topic to be rated as feasible but undesirable, which implies the questionable assumption that the realization of topics is always desirable, though more or less important.

In the latest Japanese Technology Foresight study, the impacts are also discussed in terms of expected effects and potential problems of technology generalizations (NISTEP 2001). Evaluated effects are socio-economic development, resolution of global problems, people’s needs, and expansion of intellectual resources; potential problems are adverse effects on the natural environment, on safety, and on morals/culture/society.

Proxy measures for feasibility are the anticipated cost constraint as well as technical, funding, human resources, and R&D system constraints on technological generalizations (Cuhls & Kuwahara 1994). Two proxy measures for the validity of an evaluation are the degree of certainty of an expert concerning the realization time of a topic and the self-evaluation of the expertise (Loveridge et al. 1995, NISTEP 2001).
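Purely as an illustration of how such proxies could enter a single figure of merit, the following sketch combines them into a toy index. It is not Kuusi’s (1999) measure; the Python names, the weighting scheme, and the numerical values are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class TopicEvaluation:
    """One panelist's evaluation of one Delphi topic (illustrative proxies only)."""
    importance: float   # proxy for impacts and their relevance to stakeholders (0..1)
    feasibility: float  # 1 minus perceived cost/technical/funding/R&D-system constraints (0..1)
    certainty: float    # confidence in the stated realization period (0..1)
    expertise: float    # self-rated expertise, used only as a weight (0..1)

def epistemic_utility(evaluations):
    """Toy index: expertise-weighted mean of importance x feasibility x certainty."""
    weighted = sum(e.expertise * e.importance * e.feasibility * e.certainty
                   for e in evaluations)
    total_weight = sum(e.expertise for e in evaluations) or 1.0
    return weighted / total_weight

# A topic moving from the bootlegging to the bandwagon stage: more panelists
# rate it as important, feasible, and certain, so the index rises.
bootlegging = [TopicEvaluation(0.9, 0.6, 0.7, 0.9),  # the lone champion
               TopicEvaluation(0.2, 0.2, 0.3, 0.5),
               TopicEvaluation(0.1, 0.1, 0.2, 0.4)]
bandwagon = [TopicEvaluation(0.9, 0.8, 0.8, 0.9),
             TopicEvaluation(0.8, 0.7, 0.7, 0.6),
             TopicEvaluation(0.7, 0.7, 0.6, 0.5)]
print(epistemic_utility(bootlegging), epistemic_utility(bandwagon))
```

The point of such a toy index is only to show the direction of the argument: as more panelists rate a topic as important, feasible, and certain, the index grows, mirroring the transition from the bootlegging to the bandwagon stage.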

Evaluations of the epistemic utility of technological generalizations also provide a heuristics for the decision making of a company. Let us suppose that a company has only one champion of a technology generalization based on an emerging paradigm who considers starting the realization project a reasonable choice, which means that only for him or her is the epistemic utility sufficiently high. The managers of that corporation could base a decision in favor of the project on two reasonable necessary conditions: (1) the champion is a reasonable person; and (2) the champion is ready to take an economic risk with this project. If these two conditions are met, a reasonable choice for the firm would be to start a new venture with the champion. This strategy has been found empirically, e.g., by Lovio (1993) in the Finnish electronics industry in the 1980s. Another reasonable policy is to allow the champion to continue bootlegging as long as the epistemic utility is growing both for the champion and for other key persons in the company. This means that the champion has to produce new arguments (e.g., realized minor generalizations) which convince new protagonists step by step.
 
 

7. Outlook

With respect to forthcoming research activities, we have approached the question of how to generate candidates for leitbilds from data on current research and technology. In the early 1990s, patent data was used in mid-term oriented foresight activities (Grupp 1993). With respect to nanotechnology, more recent work was carried out by Meyer et al. (2002).

Using bibliometric techniques with patent and publication data allows filtering and identifying core concepts that emerge in a specific area.[3] Mapping an area over time can illustrate when new concepts have emerged and may allow speculation on what new technological steps can be expected. Using elements of our leitbilds, experts may be able to identify clusters of techniques that would allow addressing some promising targets or conversely could speculate on how nanoscale techniques currently under development could be extended in their area of application.

However, keyword maps are typically limited to a set of the top 60 or so concepts that occur most frequently and are therefore by default fairly general in nature. Instead of focusing on the top 60 concepts, we plan to investigate a subset of nanotechnology areas (nanobiotechnology, nano-structured materials, and surface characterization) to generate a set of more specific concepts from which experts could generate topics suitable for a Delphi study. We expect to find candidates for different leitbilds by applying cluster analysis to second-order concepts in the patent applications (e.g., ranks 100-200).
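A minimal sketch of the intended procedure is given below (in Python; the keyword lists and the rank cut-off are hypothetical stand-ins for real patent or publication records):

```python
from collections import Counter
from itertools import combinations

# Toy corpus: each record stands for the keyword list of one patent or
# publication (hypothetical keywords, for illustration only).
records = [
    ["scanning tunneling microscopy", "surface analysis", "atom manipulation"],
    ["atomic force microscopy", "surface analysis", "thin films"],
    ["self-assembly", "monolayers", "thin films"],
    ["self-assembly", "biomimetics", "nanoparticles"],
    ["colloids", "nanoparticles", "dispersions"],
]

# 1. Rank concepts by frequency. On real data one would skip the most
#    frequent, very general terms and keep 'second-order' concepts
#    (e.g. ranks 100-200); here we simply drop the two most frequent ones.
freq = Counter(kw for rec in records for kw in rec)
ranked = [kw for kw, _ in freq.most_common()]
second_order = set(ranked[2:])

# 2. Count co-occurrences of the remaining concepts within records.
cooc = Counter()
for rec in records:
    kws = sorted(set(rec) & second_order)
    for a, b in combinations(kws, 2):
        cooc[(a, b)] += 1

# 3. Single-link grouping: concepts that co-occur end up in the same cluster;
#    each cluster is a candidate theme around which a leitbild might form.
clusters = []
for (a, b), n in cooc.items():
    hits = [c for c in clusters if a in c or b in c]
    merged = {a, b}.union(*hits)
    clusters = [c for c in clusters if all(c is not h for h in hits)] + [merged]

for i, cluster in enumerate(clusters, 1):
    print(f"candidate cluster {i}: {sorted(cluster)}")
```

On real data one would normalize keywords and use a proper clustering algorithm, but the chain of steps (frequency cut, co-occurrence counting, clustering, expert interpretation) remains the same.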

Another application of bibliometric techniques would be the identification of potential experts, based on the most frequently cited or linked documents in the candidate leitbild systems. Interviews with these experts may allow further analysis of their key technology generalizations and leitbilds.
 
 

Notes

[1]  This paper was presented at the ‘Workshop on Expectations in Science and Technology’, Risoe, Denmark, April 29-30, 2004. We thank participants and referees for their helpful comments.

[2]  The main difference is that the ‘vision’ in the framework of visionary management is an actor-related concept. Persons or organizations might have visions that give them the ability to plan or make policy in a farsighted way. A leitbild is not related to any specific actor. It is a principle that can be selected as a part of a vision; e.g., a firm might select ‘the sampling principle of digital technology’ (a leitbild) as a part of its vision.

[3]  For an illustration, see the maps of the most frequently co-occurring keywords in Meyer et al. 2002.
 
 

References

Bachmann, G.: 1998, Innovationsschub aus dem Nanokosmos. Technologieanalyse, VDI Technologiezentrum, Düsseldorf, section 2.3.2.

Bijker, W.E.: 1993, ‘Do not despair – There is life after constructivism’, Science, Technology and Human Values, 18, 113-38.

BMBF: 1996, Delphi-Bericht 1995 zur Entwicklung von Wissenschaft und Technik-Mini-Delphi, BMBF, Bonn.

Budworth, D.W.: 1996, ‘Overview of activities on nanotechnology and related technologies’, Report on a study for the IPTS-JRC, Seville.

Cuhls, K.; Kuwahara, T.: 1994, Outlook for Japanese and German Future Technology, Physica, Heidelberg.

Debackere, K.; Rappa, M.: 1994, ‘Science and industry: network theory and paradigms’, Technology Analysis & Strategic Management, 6(1), 21–37.

Dosi, G.: 1982, ‘Technical paradigms and technological trajectories’, Research Policy, 11(3), 147–162.

Dosi, G.; Freeman, C.; Nelson, R.; Silverberg, G.; Soete, L. (eds.): 1988, Technical Change and Economic Theory, Pinter Publishers.

Drexler, K.E.: 1991, Nanosystems: Molecular machinery, manufacturing, and computation, John Wiley, New York.

Frenken, J.W.M.: 1998, ‘Scanning Tunneling Microscopy’, in: A. ten Wolde (ed.), Nanotechnology: Towards a molecular construction kit (STT Report 60), The Hague, pp. 289-299.

Grupp, H. (ed.): 1993, Technologie am Beginn des 21. Jahrhunderts, Physica-Verlag, Heidelberg.

Kuusi, O.; Meyer, M.: 2002, ‘Technological generalizations and leitbilder – the anticipation of technological opportunities’, Technological Forecasting & Social Change, 69, 625–639.

Kuusi, O.: 1999, Expertise in the Future Use of Generic Technologies, Government Institute for Economic Research (VATT), Helsinki (B 59).

Loveridge, D.; Georghiou, L.; Nedeva, M.: 1995, United Kingdom Technology Foresight Programme. Delphi Survey, PREST, University of Manchester (p. 543).

Lovio, R.: 1993, ‘Evolution of Firm Communities in New Industries: The Case of the Finnish Electronics Industry’, Acta Universitatis Oeconomicae Helsingiensis, A-92.

Martin, P.A.: 1998, From eugenics to therapeutics: science and the social shaping of gene therapy, (D.Phil. Thesis), University of Sussex, Brighton.

Marz, L.; Dierkes, M.: 1994, ‘Leitbildprägung und Leitbildgestaltung’, in: G. Bechmann, T. Petermann (eds.), Interdisziplinäre Technikforschung: Genese, Folgen, Diskurs, Campus, Frankfurt.

Meyer, M.; Persson, O.; Power, Y.; et al.: 2002, Mapping excellence in nanotechnologies. Preparatory Study, European Commission, DG-Research, Brussels [http://europa.eu.int/comm/research/era/pdf/nanoexpertgroupreport.pdf].

NISTEP: 1997, The Sixth Technology Forecast Survey, Future Technology in Japan Toward the Year 2025, National Institute of Science and Technology Policy (NISTEP), Report No. 52.

Philipse, A.P.: 1998, ‘Colloidal Dispersions’, in: A. ten Wolde (ed.), Nanotechnology: Towards a molecular construction kit (STT Report 60), The Hague, pp. 171-8.


Martin Meyer:
SPRU, University of Sussex, Freeman Centre, Brighton BN1 9QE, UK; m.s.meyer@sussex.ac.uk
Osmo Kuusi:
VATT Government Institute for Economic Research, Helsinki, Finland; osmo.kuusi@vatt.fi

