As part of this in-depth academic study, IT risk management is evaluated on the basis of existing approaches. The question of the extent to which IT risk management can support a company is addressed and then illustrated by means of two case studies.
The aim of this thesis is to examine the security of the infrastructure of modern vehicle-to-vehicle communication. To this end, the security standards for radio communication are described in detail and then tested against possible attack models. With the knowledge of the VANET architecture thus established, various attacks become easier to understand. This exposes the vulnerabilities and highlights countermeasures at suitable points in the architecture.
Industry 4.0 enables the individual production of small batch sizes at low cost. To achieve this, all plants must be networked with one another so that they can exchange data and communicate. This networking can give rise to new risks and threats. In this thesis, IT security in Industry 4.0 is evaluated on the basis of possible threat scenarios, challenges and countermeasures. It examines what options industrial companies have to prevent hacker attacks and whether security concepts already established elsewhere can simply be adopted for industrial plants.
This thesis presents the vision of the Internet of Things (IoT) and considers both opportunities for its use and potential threats to user security. In particular, the smart home use case is examined in detail, and serious weaknesses of these devices are demonstrated using ZigBee as an example.
Background. The application of lean management is standard in many companies all over the world. It is used to continuously optimise existing production processes and to reduce the complexity of administrative processes. Unfortunately, in higher education, the awareness of lean management as a highly effective methodology is quite low.
Research aims. The research aim is to show how the lean strategy can be applied in university environments. Finally, this paper addresses the question of why it is so difficult to implement lean in a university environment and how an institution of higher education can move forward towards becoming a lean university.
Methodology. Based on a literature review, five key lean principles are presented and examples of their implementation are discussed using short case studies from our own institution. We also compare our findings with those in the literature.
Key findings. Lean offers the chance to improve the management of higher education institutions. This requires a commitment on the part of the university top management aiming at convincing all stakeholders that a culture of lean helps the institution to be able to adapt to the rapidly changing environment of higher education.
The main challenge when driving heat pumps with PV electricity is balancing differing electrical and thermal demands. In this article, a heuristic method for the optimal operation of a heat pump driven by a maximum share of PV electricity is presented. For this purpose, the thermal storages for space heating and domestic hot water (DHW) are activated in order to shift the operation of the heat pump to times of PV generation. The system under consideration refers to the thermal and electrical demands of a single-family house. It consists of a heat pump, a thermal energy storage for DHW and a grid-connected PV system. For heating and the generation of domestic hot water, the heat pump runs with two different supply temperatures, thereby achieving a maximum overall COP. Within the optimization algorithm, a set of heuristic rules is developed in such a way that the operational characteristics of the heat pump in terms of minimum running and stopping times are met, as well as the limiting constraints of upper and lower limits of room temperature and storage energy content. Based on a forecast of the electricity generated, a varying number of heat pump schedules fulfilling the boundary conditions are created. Finally, the schedule offering the maximum on-site utilization of PV electricity with a minimum number of heat pump starts, which serves as a secondary condition, is selected. Yearly simulations of this combination have been carried out. Initial results of this method indicate a significant rise in the on-site consumption of PV electricity and in the fulfilment of heating demand by renewable electricity, with no need for a massive TES for the heating system in the form of a big water tank.
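The schedule-generation and selection logic described here can be sketched roughly as follows; the PV profile, demand figures, and constraint values are invented placeholders, and the real method additionally handles room temperature limits and storage energy content.

```python
import itertools

# Hypothetical hourly PV generation (kW) over 8 time slots and heat pump data.
pv = [0.0, 0.2, 1.5, 3.0, 3.2, 2.0, 0.5, 0.0]
hp_power = 2.5          # electrical power drawn while the heat pump runs (kW)
min_run = 2             # minimum number of consecutive on-slots (running-time rule)
required_on = 4         # on-slots needed to cover the thermal demand

def feasible(schedule):
    """Boundary conditions: total thermal demand and minimum running time."""
    if sum(schedule) != required_on:
        return False
    # every run of consecutive 1s must be at least min_run slots long
    runs = [len(list(g)) for v, g in itertools.groupby(schedule) if v == 1]
    return all(r >= min_run for r in runs)

def pv_share(schedule):
    """PV electricity directly consumed by the heat pump (primary criterion)."""
    return sum(min(pv[t], hp_power) for t, on in enumerate(schedule) if on)

def starts(schedule):
    """Number of heat pump starts (secondary criterion)."""
    return sum(1 for t in range(len(schedule))
               if schedule[t] == 1 and (t == 0 or schedule[t - 1] == 0))

# enumerate candidate schedules and keep only the feasible ones
candidates = [s for s in itertools.product([0, 1], repeat=len(pv)) if feasible(s)]
# select maximum on-site PV use, breaking ties by the minimum number of starts
best = max(candidates, key=lambda s: (pv_share(s), -starts(s)))
```

With this profile the selected schedule concentrates operation in the midday PV peak as a single contiguous run.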
"Wer nicht vielfältig denkt, denkt einfältig" ("Those who do not think diversely think simple-mindedly"). With this slogan, students at Reutlingen University draw attention to the topic of diversity in a video clip. But what do we actually mean by it? Since September of last year, Professor Dr. Gabriela Tullius has been Vice President of Reutlingen University, and this area is among her responsibilities. GEA-Campus asked what diversity is about and why it concerns the university.
Events, activities and changes in 2005:
- Switch to electronic acquisitions
- Redesign of the reading room
- Redesign of the homepage
- Extension of opening hours by 8 hours per week
- Increase in loans by 7 percent
- Setting up a copy room
- Conducting a user survey
- Organization of an exhibition on the occasion of the 150th anniversary of Reutlingen University
Thanks to revenue from tuition fees, the library's acquisition expenditure increased by 75 percent compared to the previous year. In total, 50 percent more books were purchased than in the year before. This considerably improved the supply of literature to university members, as reflected in growing usage figures. In 2007, the library statistics recorded 15 percent more active users and 15 percent more first-time loans than in the previous year. Despite these positive developments, the number of reservations increased by 20 percent last year. This means that the required literature is very often on loan and not available when needed. In the same period, the number of interlibrary loan requests rose by 26 percent. In the reporting year, 1,600 e-books were acquired for the first time and made available on the campus network. In cooperation with the Bibliotheksservicezentrum, an institutional repository (Hochschulschriftenserver) was set up, enabling members of both universities to publish their work electronically.
Media expenditure at Reutlingen University increased by 18 percent compared to the previous year, with the focus shifting increasingly from print to electronic media. At the Pädagogische Hochschule, media expenditure fell by 9 percent. Library usage figures rose in all areas: the number of first-time loans increased by 2 percent, and the number of active library users by 5 percent. The use of electronic media grew by 39 percent. The number of interlibrary loans supplied rose by 1 percent, while the number requested fell by 8 percent. These figures reflect the growing attractiveness of the Reutlingen library holdings. Two projects were started in the reporting year: equipping the open-access holdings with RFID tags, and installing a search engine to optimize literature research.
This article reviews the literature on Christmas economics. First, we present an overall picture of the debate on the potential welfare loss of gift-giving and we show strategies that reduce the potential welfare loss and might increase the number of presents received. Second, we discuss the effect of Christmas on prices and the business cycle. We provide evidence that at Christmas stock prices and airfares increase, while food prices decrease.
Several ionic liquids are excellent solvents for cellulose. Starting from this, the finishing of PET fabrics with cellulose dissolved in ionic liquids such as 1-ethyl-3-methylimidazolium acetate, diethylphosphate and chloride, or butylmethylimidazolium chloride has been investigated. Finishing has been carried out from solutions of different concentrations, using microcrystalline cellulose or cotton, and by employing different cross-linkers. The viscosity of the solutions has been investigated for different ionic liquids, concentrations, cellulose sources, linkers and temperatures. Since ionic liquids exhibit no vapor pressure, simple pad-dry-cure processes are excluded. Before drying, the ionic liquid has to be removed by a rinsing step. Accordingly, rinsing with fresh ionic liquid followed by water, or direct rinsing with water, has been tested. The amount of cellulose deposited has been investigated by gravimetry and the zinc chloride-iodine test as well as by reactive dyeing. Results concerning wettability, water uptake, surface resistance, wear resistance and washing stability are presented.
The sol-gel approach offers a new class of flame retardants with a high potential for textile applications. Pure inorganic sol-gel systems, however, typically do not provide an effect sufficient for self-extinguishing behavior on their own. We therefore employed compounds with nitrogen- and phosphorus-containing groups. Especially the combination of compounds with both elements, exploiting their synergism, is promising for the aim of finding well-applicable, environmentally friendly, halogen-free flame retardants. In our approach, the sol-gel network ensured, on the one hand, the link to the textile as a nonflammable binder. On the other hand, the sol-gel-based networks modified with nitrogen-containing functional groups provided flame retardancy. In this way, a flame-retardant finishing for textiles could be obtained by simple finishing techniques such as padding. Besides a characterization with various flame tests (e.g., according to EN ISO 15025, protective clothing), we used a combination of cone calorimetry, thermogravimetry coupled with infrared spectroscopy, and scanning electron microscopy to analyze the mechanism of flame retardancy. Thus, we could show that the main mechanism is based on the formation of a protective layer. This work provides a model system for sol-gel-based flame retardants and has the potential to show the principal feasibility of the sol-gel approach in the flame retardancy of textiles. It thereby lays the groundwork for tailoring sol-gel layers from newly synthesized sol-gel precursors containing nitrogen and phosphorus groups.
Background and purpose: Transapical aortic valve replacement (TAVR) is a recent minimally invasive surgical treatment technique for elderly and high-risk patients with severe aortic stenosis. In this paper, a simple and accurate image-based method is introduced to aid the intra-operative guidance of the TAVR procedure under 2-D X-ray fluoroscopy.
Methods: The proposed method fuses a 3-D aortic mesh model and anatomical valve landmarks with live 2-D fluoroscopic images. The 3-D aortic mesh model and landmarks are reconstructed from an interventional X-ray C-arm CT system, and a target area for valve implantation is automatically estimated using these aortic mesh models. Based on a template-based tracking approach, the overlay of the visualized 3-D aortic mesh model, landmarks and target area of implantation is updated on the fluoroscopic images by approximating the aortic root motion from the motion of a pigtail catheter without contrast agent. In addition, a rigid intensity-based registration algorithm is used to continuously track the aortic root motion in the presence of contrast agent. Furthermore, sensorless tracking of the aortic valve prosthesis is provided to guide the physician in the appropriate placement of the prosthesis into the estimated target area of implantation.
Results: Retrospective experiments were carried out on fifteen patient datasets from the clinical routine of the TAVR. The maximum displacement errors were less than 2.0 mm for both the dynamic overlay of the aortic mesh models and the image-based tracking of the prosthesis, which is within the clinically accepted range. Moreover, success rates above 91.0% were obtained for all tested patient datasets.
Conclusion: The results showed that the proposed method for computer-aided TAVR is potentially a helpful tool for physicians by automatically defining the accurate placement position of the prosthesis during the surgical procedure.
Consistently high employee motivation is the central prerequisite for successful sales. Yet far too often, companies try to motivate their sales staff solely through one-off impulses and all-too-transparent incentive systems. This cannot succeed. Instead, a long-term perspective and an intelligent mix of different instruments are required.
Over several months, master's students from three continents portrayed each other via Skype. Using a special drawing technique, blind drawing, numerous portraits were created, whose impact in public space and in social networks was studied. These portraits form the basis for artistic works in all fields and media of the visual arts. The SkypeLab research project thus links traditional artistic techniques with current digital technologies.
Customer research projects are often characterized by a limited focus on particular objects of investigation, research designs and data analysis methods. Unfortunately, the frequently observed standard procedure is not always correct and in many cases even yields flawed results. The discussion of the optimal object of investigation and the appropriate research design is the subject of this first part of the article.
The standard procedure frequently observed in customer research projects often yields flawed results. We therefore advocate taking a "step back" to allow a holistic view of the toolbox of customer research instruments. Building on the first article in WiSt issue no. 4/2016, pp. 188–193, which described the initial situation and discussed the first two dimensions of customer analysis (object of research and research design), this second part addresses aspects of data analysis.
Current techniques for chromosome analysis need to be improved for the rapid, economical identification of complex chromosomal defects by sensitive and selective visualisation. In this paper, we present a straightforward method for characterising unstained human metaphase chromosomes. Backscatter imaging in a dark-field setup combined with visible and short near-infrared spectroscopy is used to monitor morphological differences in the distribution of the chromosomal fine structure in human metaphase chromosomes. The reasons for the scattering centres in the fine structure are explained. Changes in the scattering centres during preparation of the metaphases are discussed. FDTD simulations are presented to substantiate the experimental findings. We show that local scattering features, consisting of underlying spectral modulations of higher frequencies associated with a high variety of densely packed chromatin, can be represented by their scatter profiles even on a sub-microscopic level. The result is independent of the chromosome preparation and structure size. This analytical method constitutes a rapid, cost-effective and label-free cytogenetic technique which can be used in a standard light microscope.
The purpose of this paper is to examine the relationship between consumers' perception of sustainability and the application of a QR code in stores, with a focus on information searching behavior regarding sustainable aspects. An online questionnaire was conducted with fashion students at Reutlingen University: in total, 65 students participated in the survey. Paired samples t-tests and other statistical analyses were applied to test the research questions. Apart from this, the research paper is based on a literature review. Furthermore, the decision was taken to use a projective method in the form of a dummy fashion fTRACE website. Key findings of the survey are that participants give sustainable aspects a higher importance with a QR code than without one. Participants who prefer a product with detailed information experience a "positive shopping feeling" when provided with transparency via a QR code. "Origin", "production" and "quality" were rated of higher importance by those participants. These findings suggest that transparency provided through the application of a QR code in stores influences consumers' perception of sustainability. Due to the small sample size (65) of the study, the findings of this research are not generalizable to a larger population. This paper focused on consumers' information searching behavior regarding sustainable aspects, limiting its findings to impacts on the perception of sustainability. Further research is therefore recommended.
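A paired samples t-test of the kind applied above compares two measurements of the same respondents by testing whether the mean within-subject difference is zero. The following sketch uses invented illustrative ratings, not the survey data:

```python
import math
from statistics import mean, stdev

# Hypothetical importance ratings (1-5) of sustainability for the SAME eight
# respondents, first without and then with a QR code; values are illustrative.
without_qr = [2, 3, 2, 4, 3, 2, 3, 3]
with_qr    = [3, 4, 3, 5, 4, 3, 3, 4]

# Paired t statistic: mean of the pairwise differences divided by the
# standard error of those differences (df = n - 1).
diffs = [w - o for o, w in zip(without_qr, with_qr)]
n = len(diffs)
t = mean(diffs) / (stdev(diffs) / math.sqrt(n))
print(round(t, 2))  # → 7.0
```

The resulting t value would then be compared against the t distribution with n - 1 degrees of freedom to obtain a p-value.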
Methacrylated gelatin and mature adipocytes are promising components for adipose tissue engineering
(2016)
In vitro engineering of autologous fatty tissue constructs is still a major challenge for the treatment of congenital deformities, tumor resections or high-degree burns. In this study, we evaluated the suitability of photo-crosslinkable methacrylated gelatin (GM) and mature adipocytes as components for the composition of three-dimensional fatty tissue constructs. Cytocompatibility evaluations of the GM and the photoinitiator lithium phenyl-2,4,6-trimethylbenzoylphosphinate (LAP) showed no cytotoxicity in the relevant range of concentrations. The matrix stiffness of the cell-laden hydrogels was adjusted by tuning the degree of crosslinking and was shown to be comparable to that of native fatty tissue. Mature adipocytes were then cultured for 14 days within the GM, resulting in a fatty tissue construct loaded with viable cells expressing the cell markers perilipin A and laminin. This work demonstrates that mature adipocytes are a highly valuable cell source for the composition of fatty tissue equivalents in vitro. Photo-crosslinkable methacrylated gelatin is an excellent tissue scaffold and a promising bioink for new printing techniques due to its biocompatibility and tunable properties.
Despite 30 years of Electronic Design Automation, analog IC layouts are still handcrafted in a laborious fashion today due to the complex challenge of considering all relevant design constraints. This paper presents Self-organized Wiring and Arrangement of Responsive Modules (SWARM), a novel approach that addresses the problem with a multi-agent system: autonomous layout modules interact with each other to evoke the emergence of overall compact arrangements that fit within a given layout zone. SWARM's unique advantage over conventional optimization-based and procedural approaches is its ability to consider crucial design constraints both explicitly and implicitly. Several examples show that, by inducing a synergistic flow of self-organization, remarkable layout results can emerge from SWARM's decentralized decision-making model.
New drugs serving unmet medical needs are one of the key value drivers of research-based pharmaceutical companies. The efficiency of research and development (R&D), defined as the successful approval and launch of new medicines (output) relative to the monetary investments required for R&D (input), has been declining for decades. We aimed to identify, analyze and describe the factors that impact R&D efficiency. Based on publicly available information, we reviewed the R&D models of major research-based pharmaceutical companies and analyzed the key challenges and success factors of a sustainable R&D output. We calculated that the R&D efficiencies of major research-based pharmaceutical companies were in the range of USD 3.2–32.3 billion (2006–2014). As these numbers challenge the model of an innovation-driven pharmaceutical industry, we analyzed the concepts that companies are following to increase their R&D efficiency: (A) activities to reduce portfolio and project risk, (B) activities to reduce R&D costs, and (C) activities to increase the innovation potential. While category A comprises measures such as portfolio management and licensing, measures grouped in category B are outsourcing and risk-sharing in late-stage development. Companies have taken diverse steps to increase their innovation potential, and open innovation, exemplified by open source, innovation centers, or crowdsourcing, plays a key role in doing so. In conclusion, research-based pharmaceutical companies need to be aware of the key factors which impact the rate of innovation, R&D cost and probability of success. Depending on their company strategy and their R&D set-up, they can opt for one of the following open innovator roles: knowledge creator, knowledge integrator or knowledge leverager.
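The input/output definition of R&D efficiency used above amounts to a simple ratio of cumulative spending to successful launches. The figures below are invented for illustration and are not data from the analysis:

```python
# R&D efficiency expressed as cost per launched drug: cumulative R&D spend
# (input) divided by the number of newly approved and launched drugs (output)
# over the same period. Both figures are hypothetical placeholders.
rd_spend_usd_bn = 60.0   # cumulative R&D expenditure over the period, USD billion
new_launches = 8         # new drugs approved and launched in the same period

cost_per_launch_bn = rd_spend_usd_bn / new_launches
print(cost_per_launch_bn)  # → 7.5 (USD billion per launched drug)
```

A higher value means lower efficiency; the USD 3.2–32.3 billion range quoted in the abstract is this same ratio computed per company.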
Organizations are increasingly called upon to consciously shape digital work environments as well. New technologies and digital work practices are shifting the place where a shared identity is formed into virtual spaces. So far, however, executives and change managers have focused too much on things they can touch and shape physically. The authors therefore discuss how companies can shape and maintain organizational identity in virtual working environments as well, in order to support change management.
Organizations are the business world's central actors, employing multiple people who pursue collective goals while linked to an external environment. This volume is the first of two books dedicated to defining current theories of organizations and their practices. The text is filled with contributions by alumni of the ESB Business School at Reutlingen University. Part I discusses contemporary organizational forms and properties, including team aspects.
Contemporary theory and practice of organizations. - Part 2: Leading and changing the organization
(2016)
Organizations are the business world's central actors, employing multiple people who pursue collective goals while linked to an external environment. The text is filled with contributions by alumni of the ESB Business School at Reutlingen University. Part II provides a detailed overview of key themes in modern leadership and coaching, as well as organizational intervention.
This textbook and exercise book introduces the essential fundamentals of strength of materials. It presents the most important concepts and workflows of an engineering strength assessment. Particular emphasis is placed on conveying the material vividly from an engineer's perspective. For the sake of comprehensibility, mathematical derivations are therefore largely omitted in favor of a materials-science-oriented approach. This is supported by extensive tables of materials and characteristic values.
Instead of waiting for and constantly adapting to the details of political interventions, utilities need to view their environment from a holistic perspective. The unique position of the company, be it a local utility, a bigger player, or an international utility specializing in specific segments, has to be the basis of its goals and strategies. But without the consistent translation of these goals and strategies into processes, structures, and company culture, a strategy remains pure theory. Companies need to engage in a continuing learning process. This means being willing to pass on strategies, to slow down or speed up, or to work from a different angle.
This thesis presents a model that optimizes the planning of direct reuse in the rental of mobile, durable capital goods in closed-loop supply chains. The focus is on developing planning algorithms that improve the predictability of future returns, and on their economic impact for companies. The optimization model considers the company's position both internally and externally and provides the basis for decisions on corresponding strategic initiatives.
Extended workbench, global sourcing, low-cost-country potentials, outsourcing: for years these buzzwords have been used excessively in boardrooms. The market for professional services has long ceased to be limited to the strategy consulting industry. By now there is hardly any business process in a manufacturing company that has not been examined for its suitability for outsourcing to consultants or service providers. The author describes why professional service providers are increasingly being commissioned and, using a service performance study as an example, identifies the success factors for collaboration. With a collaboration checklist she sets out what matters in day-to-day practice. Finally, the risks and opportunities of using such services are examined from both the customer's and the supplier's perspective.
We present a fully automatic approach to real-time 3D face reconstruction from monocular in-the-wild videos. We use a 3D morphable face model to obtain a semi-dense shape and combine it with a fast median-based super-resolution technique to obtain a high-fidelity textured 3D face model. Our system does not need prior training and is designed to work in uncontrolled scenarios.
The book provides suggestions on how to start using bionic optimization methods, including pseudo-code examples of each of the important approaches and outlines of how to improve them. The most efficient methods for accelerating the studies are discussed. These include the selection of size and generations of a study’s parameters, modification of these driving parameters, switching to gradient methods when approaching local maxima, and the use of parallel working hardware.
Bionic optimization means finding the best solution to a problem using methods found in nature. As evolutionary strategies and particle swarm optimization seem to be the most important methods for structural optimization, we primarily focus on them. Other methods such as neural nets or ant colonies are more suited to control or process studies, so their basic ideas are outlined in order to motivate readers to start using them.
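Of the two methods highlighted above, particle swarm optimization is straightforward to sketch in a few lines. The following minimal Python example is illustrative only; the objective function, parameter ranges, and coefficients are placeholder assumptions, not material from the book:

```python
import random

def sphere(x):
    # simple test objective: minimize the sum of squares, optimum at the origin
    return sum(xi * xi for xi in x)

def pso(f, dim=2, particles=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=1):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(particles)]
    vel = [[0.0] * dim for _ in range(particles)]
    pbest = [p[:] for p in pos]          # each particle's personal best position
    gbest = min(pbest, key=f)[:]         # best position found by the whole swarm
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                # inertia plus random attraction to personal and global best
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest

best = pso(sphere)
```

Because the global best is only ever replaced by an improvement, the returned solution is the best design evaluated by any particle during the run.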
A set of sample applications shows how bionic optimization works in practice. From academic studies on simple frames made of rods to earthquake-resistant buildings, readers follow the lessons learned, difficulties encountered and effective strategies for overcoming them. For the problem of tuned mass dampers, which play an important role in dynamic control, changing the goal and restrictions paves the way for multi-objective optimization. As most structural designers today use commercial software such as FE codes or CAE systems with integrated simulation modules, ways of integrating bionic optimization into these software packages are outlined and examples of typical systems and typical optimization approaches are presented.
The closing section focuses on an overview and outlook on reliable and robust as well as on multi-objective optimization, including discussions of current and upcoming research topics in the field concerning a unified theory for handling stochastic design processes.
The invention relates to an energy transmitter (100) for the inductive transfer of energy from a primary circuit (10) of the energy transmitter (100) to a first (5) and a second (15) voltage domain of a secondary circuit (20) of the energy transmitter (100), and for the transfer of information from the secondary circuit (20) to the primary circuit (10). The energy transmitter (100) comprises: a transformer (30), via which the primary circuit (10) and the secondary circuit (20) are inductively coupled and via which both the energy transfer and the information transfer take place; and an amplitude modulation module (50) for modulating the current and/or voltage amplitude in the secondary circuit (20) by means of an amplitude modulation switch (55), wherein the amplitude modulation switch (55) is arranged between the first (5) and second (15) voltage domains of the secondary circuit (20) and is designed to change the current and/or voltage amplitude in the primary circuit (10) by opening and closing of the amplitude modulation switch (55), thereby transferring information from the secondary circuit (20) to the primary circuit (10). The present invention further relates to a gate driver for switching a power switch (500) and to a method for the inductive transfer of energy combined with the transfer of information.
The present invention relates to a method for controlling a dead time in a synchronous converter (100) in which a control switch (2) and a synchronous switch (3) are switched cyclically, the control switch (2) being switched by means of a first switching signal (S1) and the synchronous switch (3) by means of a second switching signal (S2). The method comprises capturing and holding a voltage value which describes a voltage (VSW) across the synchronous switch (3) at a specific point in time, and adjusting the first and/or second switching signal (S1, S2) for a following cycle based on the held voltage value.
An electronic driver circuit and a drive method are disclosed. The driver circuit has an output; a first output transistor with a control node and a load path, the load path being connected between the output and a first supply node; a voltage regulator configured to control a voltage across the load path of the first output transistor; and a first driver configured to drive the first output transistor as a function of a first control signal.
Operating combined heat and power (CHP) plants can save a considerable amount of primary energy. CHP plants are therefore promoted under various laws and directives. For the economical operation of a CHP plant, it is necessary either to consume the largest possible share of the electricity generated on site or to sell it to third parties (tenants, apartment owners, etc.). With the KWKG 2016, larger CHP plants become attractive, and plants with shorter annual running times can even prove more economical than pure base-load plants.
Motivation
(2016)
Since human beings started to work consciously with their environment, they have tried to improve the world they were living in. Early use of tools, increasing quality of these tools, use of new materials, fabrication of clay pots, and heat treatment of metals: all these were early steps of optimization. But even on lower levels of life than human beings or human society, we find optimization processes. The organization of a herd of buffalos to face their enemies, the coordinated strategies of these enemies to isolate some of the herd’s members, and the organization of bird swarms on their long flights to their winter quarters: all these social interactions are optimized strategies of long learning processes, most of them the result of a kind of collective intelligence acquired during long selection periods.
In this chapter we introduce methods to improve mechanical designs by bionic methods. In most cases we assume that a general idea of the part or system is given by a set of data or parameters. Our task is to modify these free parameters so that a given goal or objective is optimized without violation of any of the existing restrictions.
We have seen that bionic optimization can be a powerful tool when applied to problems with non-trivial landscapes of goals and restrictions. This, in turn, led us to a discussion of useful methodologies for applying this optimization to real problems. On the other hand, it must be stated that each optimization is a time-consuming process as soon as the problem expands beyond a small number of free parameters with simple parabolic responses. Bionic optimization is not a quick approach to solving complex questions within short times. In some cases it has the potential to fail entirely, either by sticking to local maxima or by randomly exploring the parameter space without finding any promising solutions. The following sections present some remarks on the efficiency and limitations users must be aware of. They aim to increase the knowledge base for applying bionic optimization, but they should not discourage potential users from this promising field of powerful strategies for finding good or even the best possible designs.
Application to CAE systems
(2016)
Due to the broad acceptance of CAD systems based on 3D solids, the geometric data of all common CAE (Computer-Aided Engineering) software, at least in mechanical engineering, are based on these solids. We use solid models, where the space filled by material is defined in a simple and easily usable way. Solid models allow for the development of automated meshers that transform solid volumes into finite elements. Even after some unacceptable initial trials, users are able to generate meshes of non-trivial geometries within minutes to hours, instead of days or weeks. Once meshing was no longer the cost-limiting factor of finite element studies, numerical simulation became a tool for smaller industries as well.
In the early days of automated meshing development, there were discussions over the use of tetrahedral (Fig. 4.1) or hexahedral meshes. But after a short period of time it became evident that there were, and will always be, many problems using automated meshers to generate hexahedral elements. So today nearly all automated 3D meshing systems use tetrahedral elements.
To illustrate the power and the pitfalls of Bionic Optimization, we will show some examples spanning classes of applications and employing various strategies. These applications cover a broad range of engineering tasks. Nevertheless, there is no guarantee that our experiences and our examples will be sufficient to deal with all questions and issues in a comprehensive way. As a general rule, novices should begin each class of problems with a learning phase. So, in this introductory phase, we use simple and quick examples, e.g., small FE models, linear load cases, short time intervals and simple material models. Here, beginners in the Bionic Optimization community can learn which parameter combinations to use. In Sect. 3.3 we discuss strategies for accelerating optimization studies. Using these parameters as starting points is one way to set the specific ranges, e.g., numbers of parents and kids, crossing, mutation radii, and numbers of generations. On the other hand, these trial runs will doubtless indicate that Bionic Optimization needs large numbers of individual designs, and considerable time and computing power. We recommend investing enough time preparing each task in order to avoid the frustration should large jobs fail after long calculation times.
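The parameters named above (numbers of parents and kids, mutation radii, numbers of generations) can be made concrete with a toy evolutionary strategy. This is a generic (μ+λ)-style sketch with an invented quadratic goal function, not the book's actual optimization code; in real studies the goal function would call an FE solver instead.

```python
import random

def goal(x):
    """Placeholder goal: maximize -sum(x_i^2), optimum at the origin."""
    return -sum(v * v for v in x)

def evolve(dim=3, n_parents=4, n_kids=12, mutation_radius=0.5,
           generations=50, seed=1):
    rng = random.Random(seed)
    # Random initial parent designs inside a box of free parameters.
    population = [[rng.uniform(-5.0, 5.0) for _ in range(dim)]
                  for _ in range(n_parents)]
    for _ in range(generations):
        # Each kid is a Gaussian mutation of a randomly chosen parent.
        kids = [[v + rng.gauss(0.0, mutation_radius) for v in rng.choice(population)]
                for _ in range(n_kids)]
        # Elitist selection: keep the best designs from parents and kids.
        population = sorted(population + kids, key=goal, reverse=True)[:n_parents]
    return population[0]

best = evolve()
```

Even this trivial example evaluates `n_kids × generations` designs, which hints at why bionic optimization becomes expensive once each evaluation is a full FE run.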
Detecting the adherence of driving rules in an energy-efficient, safe and adaptive driving system
(2016)
An adaptive and rule-based driving system is being developed that tries to improve driving behavior in terms of energy efficiency and safety by giving recommendations. To this end, the driving system has to monitor adherence to driving rules by matching the rules against the driving behavior. However, existing rule matching algorithms are not sufficient, as the data within a driving system change frequently. In this paper a rule matching algorithm is introduced that is able to handle frequently changing data within the context of the driving system. Fifteen journeys were used to evaluate the performance of the rule matching algorithms. The results showed that the introduced algorithm outperforms existing algorithms in the context of the driving system. Thus, the introduced algorithm is suited for matching frequently changing data against rules with higher performance, which is why it will be used in the driving system for the detection of broken energy-efficiency or safety-relevant driving rules.
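The idea of matching rules against a changing driving state can be illustrated with a naive re-evaluation sketch. The rule names and thresholds below are invented; the paper's algorithm is incremental and handles frequently changing data far more efficiently than this brute-force version.

```python
# Hypothetical driving rules as predicates over the current vehicle state.
# Rule names and thresholds are illustrative, not from the paper.
rules = {
    "speed_limit": lambda s: s["speed_kmh"] <= s["limit_kmh"],
    "smooth_braking": lambda s: s["decel_ms2"] <= 3.0,
}

def broken_rules(state):
    """Return the names of all rules the current state violates."""
    return [name for name, holds in rules.items() if not holds(state)]

# Each incoming sensor update produces a new state to match against.
state = {"speed_kmh": 62, "limit_kmh": 50, "decel_ms2": 1.2}
```

Re-evaluating every rule on every update, as done here, is exactly what becomes too slow when the state changes many times per second, which motivates the incremental matching approach the paper introduces.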
In the last decades, several driving systems were developed to improve driving behaviour in terms of energy efficiency or safety. However, these driving systems cover either energy efficiency or safety. Furthermore, they do not consider the stress level of the driver when showing a recommendation, although stress can lead to unsafe or inefficient driving behaviour. In this paper, an approach is presented to consider the driver's stress level in a driving system for safe and energy-efficient driving behaviour. The driving system tries to suppress a recommendation when the driver is stressed, in order not to stress the driver additionally in an already stressful driving situation. This can increase road safety and the user acceptance of the driving system, as the driver is not bothered or stressed by the driving system.
The evaluation of the approach showed that the driving system is able to show recommendations to the driver while also reacting to a high stress level by suppressing recommendations, in order not to stress the driver additionally.
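The suppression behaviour described in this abstract reduces to a simple gate. A minimal sketch, assuming a normalized stress score in [0, 1]; the threshold and function name are illustrative, not taken from the paper.

```python
# Illustrative stress threshold above which recommendations are suppressed.
STRESS_THRESHOLD = 0.7

def should_show(recommendation_pending: bool, stress_level: float) -> bool:
    """Show a pending recommendation only while the driver's stress is low."""
    return recommendation_pending and stress_level < STRESS_THRESHOLD
```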
Nowadays there is a rich diversity of sleep monitoring systems available on the market. They promise to offer information about the sleep quality of the user by recording a limited number of vital signals, mainly heart rate and body movement. Typically, fitness trackers, smart watches, smart shirts, smartphone applications and patches do not provide access to the raw sensor data. Moreover, the sleep classification algorithm and the agreement ratio with the gold standard, polysomnography (PSG), are not disclosed. Some commercial systems record and store the data on the wearable device, but the user needs to transfer and import it into specialised software applications, or return it to the doctor for clinical evaluation of the data set. Thus, an immediate feedback mechanism and the possibility of remote control and supervision are lacking. Furthermore, many such systems only distinguish between sleep and wake states, or between wake, light sleep and deep sleep. It is not always clear how these stages are mapped to the four known sleep stages: REM, NREM1, NREM2, NREM3-4 [1]. The goal of this research is to find a reduced-complexity method to process a minimum number of vital signals while providing accurate sleep classification results. The model we propose offers remote control and real-time supervision capabilities by using Internet of Things (IoT) technology. This paper focuses on the data processing method and the sleep classification logic; the body sensor network representing our data acquisition system will be described in a separate publication. Our solution showed promising results and a good potential to overcome the limitations of existing products. Further improvements will be made, and subjects of different ages and health conditions will be tested.
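A reduced-complexity classification of the kind described above can be caricatured with simple per-epoch thresholds on heart rate and movement. All thresholds and the stage mapping below are invented for illustration; they make no claim about the paper's actual method or its agreement with PSG.

```python
# Toy per-epoch sleep stager from two vital signals. Thresholds and the
# mapping to stages are invented; a real classifier would be trained
# against PSG-scored data.

def classify_epoch(heart_rate_bpm: float, movement_count: int) -> str:
    if movement_count > 10:
        return "wake"
    if heart_rate_bpm >= 65:
        return "REM"          # REM often shows elevated, variable heart rate
    if heart_rate_bpm >= 55:
        return "light sleep"  # roughly NREM1/NREM2
    return "deep sleep"       # roughly NREM3-4

# A sequence of (heart rate, movement) epochs yields a coarse hypnogram.
hypnogram = [classify_epoch(hr, mv)
             for hr, mv in [(72, 15), (60, 2), (50, 0), (68, 1)]]
```

In an IoT deployment as described above, each epoch's classification could be streamed to a remote supervisor for real-time feedback.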
The increasing emergence of cyber-physical systems (CPS) and the global interconnection of these CPS into cyber-physical production systems (CPPS) are leading to fundamental changes in future work and logistic systems, requiring innovative methods to plan, control and monitor changeable production systems as well as new forms of human-machine collaboration. Logistic systems in particular have to accommodate the versatility of CPPS and will be transformed into so-called cyber-physical logistic systems, since the logistical networks will be subject to constant changes initiated by changeable production systems. This development is driven and enhanced by increasingly volatile and globalized market and manufacturing environments, combined with a high demand for individualized products and services. Moreover, the centralized control systems mainly used today are being pushed to their limits regarding their ability to deal with the complexity of planning, controlling and monitoring changeable work and logistic systems. Decentralized control systems bear the potential to cope with these challenges by distributing the required operations across the various nodes of the resulting decentralized control system.
Learning factories, like the ESB Logistics Learning Factory at ESB Business School (Reutlingen University), provide a wide range of possibilities to develop new methods and innovative technical solutions in a risk-free and close-to-reality factory environment and to transfer knowledge as well as specific competences into the training of students and professionals. To intensify the research and training activities in the field of future work and logistics systems, ESB Business School is transferring its existing production system into a CPPS involving decentralized planning, control and monitoring methods and systems, human-machine-collaboration as well as technical assistance systems for changeable work and logistics systems.
A seamless convergence of the digital and physical factory, aiming at a personalized Product Emergence Process (PPEP) for smart products within the ESB Logistics Learning Factory at Reutlingen University.
One of the mission scenarios in the ESB Logistics Learning Factory at ESB Business School (Reutlingen University) is a completely new business model with reference to Industrie4.0, facilitated by 3D experience software, for today's networked society in which customers expect immediate responses, delightful experiences and simple solutions.
The business experience platform provides software solutions for every organization in the company and in the factory. An interface with dashboards, project management apps, 3D design and construction apps with high-end visualization, manufacturing and simulation apps, as well as intelligence and social network apps in a collaborative interactive environment helps users learn to create an end-to-end value process for a personalized product that is first virtual and later physically produced.
Instead of following traditional ways of working in a conventionally operating factory, real workers and robots work together semi-intuitively. The centerpiece of the self-planned interim factory is the smart personalized product, uniquely identifiable and locatable at all times during the production process: a scooter with an individually colored mobile phone holder for any smartphone, produced with a 3D printer in lot size one. In the future, smart products will incorporate internet-based services, designed and manufactured at the cost of mass products. Additionally, the scooter is equipped with a retrievable declarative product memory. Monitoring and control are handled by sensor tags and a Raspberry Pi positioned on the product. The engineering design and implementation of a changeable production system are guided by a self-execution system that, among other things, independently finds suitable workplaces.
The competences imparted to students and professionals include the project management method SCRUM, the customization of workflows according to Industrie4.0 principles, the enhancement of products with new personalized intelligent parts and self-programmed electrical and electronic components, the control of access to the product-memory information, planning in a digital engineering environment, and setting up the physical factory to produce customer orders. The action-oriented experience gained relates to the chances and requirements of holistic digital and physical systems.
Context: An experiment-driven approach to software product and service development is gaining increasing attention as a way to channel limited resources to the efficient creation of customer value. In this approach, software capabilities are developed incrementally and validated in continuous experiments with stakeholders such as customers and users. The experiments provide factual feedback for guiding subsequent development.
Objective: This paper explores the state of the practice of experimentation in the software industry. It also identifies the key challenges and success factors that practitioners associate with the approach.
Method: A qualitative survey based on semi-structured interviews and thematic coding analysis was conducted. Ten Finnish software development companies, represented by thirteen interviewees, participated in the study.
Results: The study found that although the principles of continuous experimentation resonated with industry practitioners, the state of the practice is not yet mature. In particular, experimentation is rarely systematic and continuous. Key challenges relate to changing the organizational culture, accelerating the development cycle speed, and finding the right measures for customer value and product success. Success factors include a supportive organizational culture, deep customer and domain knowledge, and the availability of the relevant skills and tools to conduct experiments.
Conclusions: It is concluded that the major issues in moving towards continuous experimentation are on an organizational level; most significant technical challenges have been solved. An evolutionary approach is proposed as a way to transition towards experiment-driven development.