Creating Knowledge for Climate-Friendly Energy Technology: The Secret of Success of Combined Heat and Power (Kraft-Wärme-Kopplung)
(2017)
As a highly efficient energy generation technology, combined heat and power (CHP, German: Kraft-Wärme-Kopplung, KWK) plays an important role in the energy transition and in climate protection. Particularly for the building sector and for companies, CHP can be an economical and environmentally friendly way to generate electricity and heat. Planning, implementing, and operating CHP plants requires skilled craftspeople and engineers. For this reason, the Ministerium für Umwelt, Klima und Energiewirtschaft (Ministry of the Environment, Climate and the Energy Sector), together with the Handwerkstag Baden-Württemberg ..., ran the qualification course "Kraft-Wärme-Kopplung - Kompetenz für den Wärme- und Energiemarkt von heute und morgen" (Combined Heat and Power - Competence for Today's and Tomorrow's Heat and Energy Market) in 2015 and 2016... Owing to the success of the seminar series, it will also be continued in 2017 and beyond.
In machining with a geometrically defined cutting edge (e.g. turning, milling, drilling, reaming, sawing, planing, shaping, broaching), cutting tools with a defined edge geometry are used. The values of the individual geometric dimensions are based on guideline and empirical values. The definitions of the individual geometric features (e.g. angles) are given in DIN 6581. In recent years, research projects have been carried out to determine the process behavior of different geometry parameters, investigating the influence of the cutting-edge design (e.g. Zabel 2010). The cutting-edge geometry is usually produced by grinding, electrical discharge machining, or laser machining. The tools are manufactured and reconditioned on universal or special (tool-grinding) machines. The equipment and the degree of automation depend on the tool features to be machined and how frequently they occur.
Instead of waiting for and constantly adapting to the details of political interventions, utilities need to view their environment from a holistic perspective. The unique position of the company - be it a local utility, a bigger player, or an international utility specializing in specific segments - has to be the basis of its goals and strategies. But without consistently translating these goals and strategies into processes, structures, and company culture, a strategy remains pure theory. Companies need to engage in a continuing learning process. This means being willing to let go of strategies, to slow down or speed up, to work from a different angle, etc.
Induced by a societal decision to phase out conventional energy production - the so-called Energiewende (energy transition) - the rise of distributed generation acts as a game changer within the German energy market. The share of electricity produced from renewable resources increased to 31.6% in 2015 (UBA, 2016), with a targeted share of renewables in the electricity mix of 55%-60% by 2035 (RAP, 2015), opening perspectives for new products and services. Moreover, the rapidly increasing degree of digitization enables innovative and disruptive business models in niches at the grid's edge that might be the winners of the future. It also stimulates the market entry of newcomers and competitors from other sectors, such as IT or telecommunications, challenging the incumbent utilities. For example, virtual and decentralized marketplaces for energy are emerging, a trend that is likely to be accelerated considerably by blockchain technology if the regulatory environment is adjusted accordingly. Consequently, the energy business is being turned upside down, with customers now at the wheel. For instance, more than one-third of the renewable production capacities are owned by private persons (Trendsearch, 2013). The objective of this chapter is therefore to examine private energy consumer and prosumer segments and their needs in order to derive business models for the various decentralized energy technologies and services. Subsequently, success factors for dealing with the changing market environment and the consequences of the potentially disruptive developments for the market structure are evaluated.
Virtual prototyping of integrated mixed-signal smart sensor systems requires high-performance co-simulation of analog frontend circuitry with complex digital controller hardware and embedded real-time software. We use SystemC/TLM 2.0 in conjunction with a cycle-count accurate temporal decoupling approach (TD) to simulate digital components and firmware code execution at high speed while preserving clock-cycle accuracy and, thus, real-time behavior at time quantum boundaries. Optimal time quanta ensuring real-time capability can be calculated and set automatically during simulation if the simulation engine has access to exact timing information about upcoming inter-process communication events. These methods fail in the case of non-deterministic, asynchronous events, resulting in potentially invalid simulation results. In this paper, we propose an extension to the case of asynchronous events generated by blackbox sources from which a priori event timing information is not available, such as coupled analog simulators or hardware in the loop. Additional event processing latency or rollback effort caused by temporal decoupling is minimized by calculating optimal time quanta dynamically in a SystemC model using a linear prediction scheme. We analyze the theoretical performance of the presented predictive temporal decoupling approach (PTD) by deriving a cost model that expresses the expected simulation effort in terms of key parameters such as time quantum size and CPU time per simulation cycle. For an exemplary smart-sensor system model, we show that quasi-periodic events that trigger activities in TD processes are handled accurately after the predictor has settled.
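The core idea of the predictive scheme can be illustrated with a small sketch. The following toy Python code (all names and parameters are illustrative, not taken from the paper) predicts the arrival time of the next asynchronous event from the recent history of inter-event intervals and sizes the time quantum so that temporally decoupled processes stop just before the predicted event:

```python
# Hypothetical sketch of the linear-prediction idea behind PTD:
# estimate the next event time from recent inter-event gaps and
# derive a time quantum from it. Illustrative only.

from collections import deque

class QuantumPredictor:
    def __init__(self, history=4, min_quantum=1):
        self.intervals = deque(maxlen=history)  # recent inter-event gaps
        self.last_event = None
        self.min_quantum = min_quantum

    def observe(self, t):
        """Record an observed asynchronous event at simulation time t."""
        if self.last_event is not None:
            self.intervals.append(t - self.last_event)
        self.last_event = t

    def next_quantum(self, now):
        """Suggest a time quantum ending at the predicted next event."""
        if not self.intervals:
            return self.min_quantum           # predictor not settled yet
        predicted_gap = sum(self.intervals) / len(self.intervals)
        predicted_event = self.last_event + predicted_gap
        return max(self.min_quantum, predicted_event - now)

# For a quasi-periodic source, the prediction settles on the period:
p = QuantumPredictor()
for t in (100, 200, 300, 400):   # events every 100 time units
    p.observe(t)
print(p.next_quantum(now=400))   # -> 100.0
```

For a quasi-periodic event source the averaged interval converges after a few observations, matching the settling behavior reported in the abstract; a real implementation would additionally handle mispredictions by rollback or by shortening the quantum.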
We discuss the fabrication technologies for IC chips in this chapter. We will focus on the main process steps, and especially on those aspects that are of particular importance for understanding how they affect, and in some cases drive, the layout of ICs. All our analyses in this chapter use silicon as the base material; the principles and understanding gained can be applied to other substrates as well. Following a brief introduction to the fundamentals of IC fabrication (Sect. 2.1) and the base material used in it, namely silicon (Sect. 2.2), we discuss the photolithography process deployed for all structuring work in Sect. 2.3. We then present in Sect. 2.4 some theoretical opening remarks on typical phenomena encountered in IC fabrication. Knowledge of these phenomena is very useful for understanding the process steps we cover in Sects. 2.5-2.8. We examine a simple example process in Sect. 2.9 and observe how a field-effect transistor (FET), the most important device in modern integrated circuits, is created. To drive the key points home, we provide a review of each topic at the end of every section from the point of view of layout design by discussing relevant physical design aspects.
Despite 30 years of Electronic Design Automation, analog IC layouts are still handcrafted in a laborious fashion today due to the complex challenge of considering all relevant design constraints. This paper presents Self-organized Wiring and Arrangement of Responsive Modules (SWARM), a novel approach that addresses the problem with a multi-agent system: autonomous layout modules interact with each other to evoke the emergence of overall compact arrangements that fit within a given layout zone. SWARM's unique advantage over conventional optimization-based and procedural approaches is its ability to consider crucial design constraints both explicitly and implicitly. Several examples show that, by inducing a synergistic flow of self-organization, remarkable layout results can emerge from SWARM's decentralized decision-making model.
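To give a feel for the decentralized principle (this is a deliberately simplified toy, not SWARM's actual algorithm), the following Python sketch lets one-dimensional layout modules act as autonomous agents: each module repeatedly nudges itself away from overlapping neighbours and clamps itself back into the layout zone, so an overlap-free arrangement emerges without any central optimizer or global cost function:

```python
# Toy illustration of decentralized, agent-based module arrangement.
# Hypothetical names and parameters; not the SWARM algorithm itself.

import random

class Module:
    """A layout module acting as an autonomous agent (1-D toy)."""
    def __init__(self, x, w):
        self.x, self.w = x, w            # left edge and width

    def overlap(self, other):
        left = max(self.x, other.x)
        right = min(self.x + self.w, other.x + other.w)
        return max(0.0, right - left)

def settle(modules, zone_width, steps=500, push=0.5):
    """Each agent reacts only to its local overlaps; no global cost."""
    for _ in range(steps):
        for m in modules:
            for n in modules:
                if n is m:
                    continue
                ov = m.overlap(n)
                if ov > 0.0:
                    # nudge away from the intruding neighbour
                    direction = 1.0 if m.x >= n.x else -1.0
                    m.x += direction * push * ov
            # responsive behaviour: stay inside the layout zone
            m.x = min(max(m.x, 0.0), zone_width - m.w)
    return modules

random.seed(0)
mods = settle([Module(random.uniform(0.0, 6.0), 1.5) for _ in range(4)],
              zone_width=10.0)
```

The interesting property, mirrored in the abstract, is that the zone constraint is handled explicitly (the clamp) while mutual non-overlap is handled implicitly, emerging from local interactions alone.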
Innovative drive technology must turn current requirements and specific user demands, together with the available technological possibilities, into highly efficient solutions. To this end, electronics, software, and mechanics must be precisely integrated and optimized, from calculation through to execution, in order to also meet today's economic and ecological demands on modern drives.
We have seen that bionic optimization can be a powerful tool when applied to problems with non-trivial landscapes of goals and restrictions. This, in turn, led us to a discussion of useful methodologies for applying this optimization to real problems. On the other hand, it must be stated that every optimization becomes a time-consuming process as soon as the problem expands beyond a small number of free parameters with simple parabolic responses. Bionic optimization is not a quick way to solve complex questions. In some cases it may fail entirely, either by getting stuck in local maxima or by randomly exploring the parameter space without finding any promising solutions. The following sections present some remarks on the efficiency and the limitations users must be aware of. They aim to broaden the knowledge base for applying bionic optimization, but they should not discourage potential users from this promising field of powerful strategies for finding good or even the best possible designs.
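The failure mode of sticking to local maxima is easy to reproduce with a minimal bionic strategy. The sketch below (a simple (1+λ) evolution strategy on an illustrative one-dimensional multi-modal goal function; all names and parameters are assumptions for illustration) finds the global maximum from one start point but gets trapped in a local maximum from another:

```python
# Minimal (1+lambda) evolution strategy on a multi-modal landscape.
# Illustrative only: shows the local-maximum trap discussed above.

import math, random

def goal(x):
    # multi-modal goal: global maximum near x = 0, lower peaks elsewhere
    return math.cos(3.0 * x) - 0.1 * x * x

def evolve(x0, sigma=0.3, offspring=10, generations=100, seed=1):
    rng = random.Random(seed)
    best = x0
    for _ in range(generations):
        # mutate the current best; keep the fittest of parent + children
        children = [best + rng.gauss(0.0, sigma) for _ in range(offspring)]
        best = max(children + [best], key=goal)
    return best

near_global = evolve(x0=0.5)   # starts close to the global peak
trapped     = evolve(x0=2.5)   # converges to a lower, local peak
```

With the chosen mutation width the jump across the valley between the peaks is extremely improbable, so the second run stalls at the inferior design; larger populations, wider mutations, or restarts mitigate this, at the cost of the run times discussed in the text.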
Tuned mass dampers (TMD) are successfully applied to protect tall buildings in endangered zones from seismic events. In many applications the dampers are distributed along the height of the edifice to reduce damage during an earthquake. Dimensioning a TMD is a multidimensional optimization problem with many local maxima. To find the absolute best or at least a very good design, advanced optimization strategies have to be applied. Bionic optimization offers different methods for such tasks but requires many repeated studies of the building and damper design. To speed up the analysis, the authors propose a reduced model of the building including the dampers. A series of consecutive generations shows a growing capacity to reduce the impact of an earthquake on the building. The proposals found help to dimension the dampers. A detailed analysis of the building under earthquake loading may then yield an efficient design.
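A common closed-form starting point for such a damper dimensioning, before any bionic search refines it, is Den Hartog's classical tuning of a damped absorber on an undamped main structure under harmonic excitation (an assumption; seismic loading generally requires the numerical optimization described in the abstract, and these values merely seed the search):

```python
# Den Hartog's classical TMD tuning (harmonic excitation, undamped
# main structure). Used here only as an illustrative seed design.

import math

def den_hartog_tuning(mass_ratio):
    """Optimal frequency ratio and damping ratio for mass ratio mu."""
    mu = mass_ratio
    freq_ratio = 1.0 / (1.0 + mu)        # f_opt = 1 / (1 + mu)
    # zeta_opt = sqrt( 3*mu / (8*(1 + mu)^3) )
    damping = math.sqrt(3.0 * mu / (8.0 * (1.0 + mu) ** 3))
    return freq_ratio, damping

# e.g. a damper carrying 5 % of the building's modal mass:
f_opt, zeta_opt = den_hartog_tuning(0.05)
```

Starting the evolutionary generations from such a physically reasonable design, rather than from random parameters, typically shortens the many repeated analyses of the reduced building model.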