Programmable nano-bio interfaces driven by tuneable vertically configured nanostructures have recently emerged as a powerful tool for cellular manipulations and interrogations. Such interfaces have strong potential for ground-breaking advances, particularly in cellular nanobiotechnology and mechanobiology. However, the opaque nature of many nanostructured surfaces makes non-destructive, live-cell observation and characterization of cellular behavior on vertically aligned nanostructures challenging. Here, a new nanofabrication route is proposed that enables harvesting of vertically aligned silicon (Si) nanowires and their subsequent transfer onto an optically transparent substrate, with high efficiency and without artefacts. We demonstrate the potential of this route for efficient live-cell phase contrast imaging and subsequent characterization of cells growing on vertically aligned Si nanowires. This approach provides the first opportunity to understand dynamic cellular responses to a cell-nanowire interface, and thus has the potential to inform the design of future nanoscale cellular manipulation technologies.
Historically, research and development (R&D) in the pharmaceutical sector has predominantly been an in-house activity. To enable investments for game-changing late-stage assets and to enable better and less costly go/no-go decisions, most companies have employed a fail-early paradigm through the implementation of clinical proof-of-concept organizations. To fuel their pipelines, some pioneers started to complement their internal R&D efforts through collaborations as early as the 1990s. In recent years, multiple extrinsic and intrinsic factors induced an opening for external sources of innovation and resulted in new models for open innovation, such as open sourcing, crowdsourcing, public–private partnerships, innovation centres, and the virtualization of R&D. Three factors seem to determine the breadth and depth of how companies approach external innovation: (1) the company’s legacy, (2) the company’s willingness and ability to take risks and (3) the company’s need to control IP and competitors. In addition, these factors often constitute the major hurdles to effectively leveraging external opportunities and assets. Conscious and differential choices of the R&D and business models for different companies and different divisions in the same company seem to best allow a company to fully exploit the potential of both internal and external innovations.
Software Process Improvement (SPI) programs have been implemented, inter alia, to improve the quality and speed of software development. SPI addresses many aspects, ranging from individual developer skills to entire organizations. It comprises, for instance, the optimization of specific activities in the software lifecycle as well as the creation of organizational awareness and project culture. In the course of conducting a systematic mapping study on the state of the art in SPI from a general perspective, we observed that Software Quality Management (SQM) is of particular relevance in SPI programs. In this paper, we provide a detailed investigation of those papers from the overall systematic mapping study that were classified as addressing SPI in the context of SQM (including testing). From the main study’s result set, 92 papers were selected for an in-depth systematic review to study the contributions and to develop an initial picture of how these topics are addressed in SPI. Our findings show a fairly pragmatic contribution set in which different solutions are proposed, discussed, and evaluated. Among others, our findings indicate a certain reluctance towards standard quality or (test) maturity models and a strong focus on custom review, testing, and documentation techniques, while a set of five selected improvement measures is addressed almost equally often.
Near-Data Processing (NDP) is a key computing paradigm for reducing the ever-growing time and energy costs of data transport versus computation. With their flexibility, FPGAs are an especially suitable compute element for NDP scenarios. Even more promising is the exploitation of novel and future non-volatile memory (NVM) technologies for NDP, which aim to achieve DRAM-like latencies and throughputs while providing large-capacity non-volatile storage.
Experimentation in using FPGAs in such NVM-NDP scenarios has been hindered, though, by the fact that the NVM devices/FPGA boards are still very rare and/or expensive. It thus becomes useful to emulate the access characteristics of current and future NVMs using off-the-shelf DRAMs. If such emulation is sufficiently accurate, the resulting FPGA-based NDP computing elements can be used for actual full-stack hardware/software benchmarking, e.g., when employed to accelerate a database.
For this use, we present NVMulator, an open-source, easy-to-use hardware emulation module that can be seamlessly inserted between the NDP processing elements on the FPGA and a conventional DRAM-based memory system. We demonstrate that, with suitable parametrization, the emulated NVM can come very close to the performance characteristics of actual NVM technologies, specifically Intel Optane. We achieve 0.62% and 1.7% accuracy for cache-line-sized read and write accesses, respectively, while utilizing only 0.54% of the LUT logic resources on a Xilinx/AMD AU280 UltraScale+ FPGA board. We consider both file-system and database access patterns, examining the operation of the RocksDB database when running on real or emulated Optane-technology memories.
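The core emulation idea, padding fast DRAM accesses with stall cycles until the observed latency matches a slower NVM target, can be sketched in a few lines. All constants and names below are illustrative placeholders, not NVMulator parameters:

```python
# Toy software model of latency padding for NVM emulation on DRAM.
# The timing constants are hypothetical, not taken from NVMulator.

DRAM_READ_NS = 80.0   # assumed raw DRAM read latency


def stall_cycles(target_ns, base_ns, clock_mhz=250):
    """Whole stall cycles to insert so base_ns + stall ~= target_ns."""
    extra_ns = max(0.0, target_ns - base_ns)
    cycle_ns = 1000.0 / clock_mhz          # duration of one clock cycle
    return round(extra_ns / cycle_ns)


def emulated_latency_ns(target_ns, base_ns, clock_mhz=250):
    """Latency actually observed after inserting whole stall cycles."""
    cycle_ns = 1000.0 / clock_mhz
    return base_ns + stall_cycles(target_ns, base_ns, clock_mhz) * cycle_ns
```

With these placeholder values, emulating a hypothetical 305 ns NVM read on top of 80 ns DRAM at 250 MHz inserts 56 stall cycles and yields 304 ns, i.e. the quantization to whole cycles leaves a sub-percent residual error, which mirrors why an emulator of this kind can track a real device closely but not exactly.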
Hypermedia as the Engine of Application State (HATEOAS) is one of the core constraints of REST. It refers to the concept of embedding hyperlinks into the response of a queried or manipulated resource to show a client possible follow-up actions and transitions to related resources. Thus, this concept aims to provide a client with navigational support when interacting with a Web-based application. Although HATEOAS should be implemented by any Web-based API claiming to be RESTful, API providers tend to offer service descriptions in place of embedding hyperlinks into responses. Instead of relying on navigational support, a client developer has to read the service description and identify the resources and URIs that are relevant for interacting with the API. In this paper, we introduce an approach that aims to identify transitions between resources of a Web-based API by systematically analyzing the service description only. We devise an algorithm that automatically derives a URI Model from the service description and then analyzes the payload schemas to identify feasible values for the substitution of path parameters in URI Templates. We implement this approach as a proxy application, which injects hyperlinks representing transitions into the response payload of a queried or manipulated resource. The result is HATEOAS-like navigational support through an API. Our first prototype operates on service descriptions in the OpenAPI format. We evaluate our approach using ten real-world APIs from different domains. Furthermore, we discuss the results as well as the observations captured in these tests.
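The substitution step at the heart of this approach can be pictured with a small sketch (hypothetical names and a deliberately naive matching rule; the actual algorithm works on full OpenAPI descriptions and payload schemas): a URI Template becomes a candidate hyperlink when every path parameter can be filled from the current response payload.

```python
import re


def inject_links(uri_templates, payload):
    """Toy version of the proxy idea: inject a '_links' list into a
    response payload. A URI template qualifies when every '{param}'
    placeholder can be substituted with a value from the payload."""
    links = []
    for template in uri_templates:
        uri, ok = template, True
        for param in re.findall(r"\{(\w+)\}", template):
            if param in payload:
                uri = uri.replace("{" + param + "}", str(payload[param]))
            else:
                ok = False          # unresolved parameter: skip template
                break
        if ok and uri != template:  # keep only resolved, parameterized URIs
            links.append(uri)
    return {**payload, "_links": links}
```

For example, given the templates `/orders/{orderId}/items` and `/users/{userId}` and a response `{"orderId": 42}`, only `/orders/42/items` is injected, since `userId` cannot be resolved from the payload.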
Data analytics tasks on large datasets are computationally intensive and often demand the compute power of cluster environments. Yet steps such as data cleansing, preparation, dataset characterization, and the computation of statistics or metrics are frequent. These are mostly performed ad hoc, in an explorative manner, and mandate low response times. Such steps are, however, I/O intensive and typically very slow due to low data locality and inadequate interfaces and abstractions along the stack. They typically result in prohibitively expensive scans of the full dataset and in transformations on interface boundaries.
In this paper, we examine R as an analytical tool managing large persistent datasets in Ceph, a widespread cluster file system. We propose nativeNDP – a framework for Near Data Processing that pushes down primitive R tasks and executes them in situ, directly within the storage device of a cluster node. Across a range of data sizes, we show that nativeNDP is more than an order of magnitude faster than other pushdown alternatives.
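The pushdown principle can be contrasted in a few lines of toy code (illustrative only; nativeNDP pushes primitive R tasks into Ceph storage nodes, which this Python sketch does not model): instead of shipping every row to the analytics client, the storage side evaluates the primitive locally and returns only the small result.

```python
def scan_and_ship(storage_rows):
    """Baseline: transfer the whole dataset, then aggregate client-side.
    Returns (mean, number of rows shipped over the network)."""
    shipped = list(storage_rows)        # models the full data transfer
    return sum(shipped) / len(shipped), len(shipped)


def pushdown_mean(storage_rows):
    """NDP-style: aggregate where the data lives, ship one tuple back.
    Returns (mean, number of 'rows' crossing the wire)."""
    total, count = 0, 0
    for value in storage_rows:          # runs inside the storage node
        total += value
        count += 1
    return total / count, 1             # only the aggregate is shipped
```

Both paths compute the same mean, but the shipped-data count differs by the dataset size, which is the locality gap that pushdown exploits.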
Classical sales tasks are changing profoundly and rapidly. Sales managers urgently need new strategic approaches for how to shape customer contacts, manage distribution channels, and sell more effectively in the future. A recent study sheds light on how companies can prepare for this structural change.
Against the background that conceptual foundations for sustainability management in non-university research institutions have recently been developed for the first time, these results are now to feed into the design of an interdisciplinary, cross-organizational approach for integrating sustainability aspects into teaching and continuing education. The empirical basis is a qualitative, organizational-ethnographic case study in which the authors examine experts' interpretations as well as their operational and everyday knowledge with respect to sustainable human resource management. Building on the practical implementation options derived from this for non-university research institutions, an approach is now to be developed for transferring the results to university teaching and to the inter-organizational continuing education of research institutions.
Hip-hop culture defines itself through four central pillars: DJing, MCing, breakdancing, and graffiti, but a fifth one, fashion, may be emerging. Hip-hop has become the most popular music genre, and the influence it has on society is indisputable. But as hip-hop artists increasingly underpin their music with visual components, such as music videos, the question arises whether this has an influence on the fashion industry. This chapter clarifies which factors may determine a fashion business impact and discusses differences between mainstream hip-hop artists and those that are also active in the fashion industry. The focus lies on how, and how much, fashion is presented in the music videos. Twenty-four music videos were analyzed: 15 popular records from the past three years and nine by artists already considered influential in fashion. Additionally, a fashion influence index was created to compare the degree of fashion across the music videos. The numbers of styles, recognized brands, fashion-related song verses, fashion-related description-box mentions, and articles about the fashion in the music video were recorded. Findings reveal that the number of outfits shown in a video has no direct link to the amount of traffic it produces in fashion media. The artists considered influential in the fashion industry name brands in their song lyrics more often and show brand logos more frequently in their music videos than others. Over the observed years, though, a rise in fashion awareness can be seen for mainstream hip-hop artists through a higher number of styles, recognizable brands, and fashion-related verses in the lyrics.
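As an illustration only (the chapter does not publish its exact formula or weights), an index over the five recorded indicators could be combined along these lines:

```python
def fashion_influence_index(counts, weights=None):
    """Toy fashion-influence index: a weighted sum of the five indicators
    recorded per music video. The indicator names and the default unit
    weights are hypothetical, not taken from the chapter."""
    keys = ("styles", "brands", "lyric_mentions", "box_mentions", "articles")
    weights = weights or {k: 1.0 for k in keys}   # unweighted by default
    return sum(weights[k] * counts.get(k, 0) for k in keys)
```

With unit weights, a video showing 6 styles, 3 recognized brands, and 2 fashion-related verses scores 11.0; passing a custom `weights` dict would let, say, press articles count more than outfit counts.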
Music is omnipresent and an important factor for cultural and social development. Nevertheless, the connection between music and fashion has rarely been examined. In particular, this research paper is concerned with the connection between music and fashion communication, with special interest in its emotional background in the context of neuromarketing. The research question of how music, regarded as an emotional stimulus in the context of neuromarketing, affects the perception of a fashion brand has been investigated through a review of the existing literature. Without attempting to explain neurological processes to their core, this paper gives an overview of how music generates emotion and how this can be used for branding activities. The result is that music, when used in marketing, elicits a positive emotional response in consumers. Through this emotional response, the perception, identity, and recall of a brand are strongly influenced.
The aim of this paper is to provide an understanding of the extent to which music and fashion are interdependent and interact, with reference to the development of music and fashion trends from 1950 until today. It further helps the reader gain an insight into whether available technology will influence the development of, and access to, music and fashion in the future. The research for this paper required the use of secondary sources, including library and online research. The goal was to gather information about the former and current development of music and fashion. These methods were the best alternatives among secondary sources, as they provided trusted results and thus enhanced the accuracy of the data collected. However, they were also limited, since data on the fashion and music developments of the noughties in particular were scarce. This is explained by the key finding that the development of this period is not as distinct as that of earlier times, when a fashion trend came along with a new music genre or hit. This implies that fashion and music correlate to a certain extent, but that the recent period is characterized by a reactivation of the music and fashion trends of previous times rather than by new inventions.
This paper presents a novel multi-modal CNN architecture that exploits complementary input cues in addition to sole color information. The joint model implements a mid-level fusion that allows the network to exploit cross-modal interdependencies already on a medium feature level. The benefit of the presented architecture is shown for the RGB-D image understanding task. So far, state-of-the-art RGB-D CNNs have used network weights trained on color data. In contrast, a superior initialization scheme is proposed to pre-train the depth branch of the multi-modal CNN independently. In an end-to-end training, the network parameters are optimized jointly using the challenging Cityscapes dataset. Thorough experiments show the effectiveness of the proposed model. Both the RGB GoogLeNet and further RGB-D baselines are outperformed by a significant margin on two different tasks: semantic segmentation and object detection. For the latter, this paper shows how to extract object-level ground truth from the instance-level annotations in Cityscapes in order to train a powerful object detector.
Automatic segmentation is essential for brain tumor diagnosis, disease prognosis, and follow-up therapy of patients with gliomas. Still, accurate detection of gliomas and their sub-regions in multimodal MRI is very challenging due to the variety of scanners and imaging protocols. Over the last years, the BraTS Challenge has provided a large number of multi-institutional MRI scans as a benchmark for glioma segmentation algorithms. This paper describes our contribution to the BraTS 2022 Continuous Evaluation challenge. We propose a new ensemble of multiple deep learning frameworks, namely DeepSeg, nnU-Net, and DeepSCAN, for automatic glioma boundary detection in pre-operative MRI. It is worth noting that our ensemble models took first place in the final evaluation on the BraTS testing dataset, with Dice scores of 0.9294, 0.8788, and 0.8803, and Hausdorff distances of 5.23, 13.54, and 12.05 for the whole tumor, tumor core, and enhancing tumor, respectively. Furthermore, the proposed ensemble method ranked first in the final ranking on another unseen test dataset, namely the Sub-Saharan Africa dataset, achieving mean Dice scores of 0.9737, 0.9593, and 0.9022, and HD95 values of 2.66, 1.72, and 3.32 for the whole tumor, tumor core, and enhancing tumor, respectively.
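The Dice scores reported above measure the overlap between a predicted and a reference segmentation mask; a minimal computation for binary masks (flattened to 0/1 sequences, which is a simplification of the volumetric evaluation used in BraTS) looks like this:

```python
def dice_score(pred, truth):
    """Dice coefficient 2*|P & T| / (|P| + |T|) for binary masks given
    as flat sequences of 0/1 labels. Two empty masks score 1.0."""
    inter = sum(p * t for p, t in zip(pred, truth))  # |P & T|
    denom = sum(pred) + sum(truth)                   # |P| + |T|
    return 2.0 * inter / denom if denom else 1.0
```

For instance, a prediction covering three voxels against a reference covering two, with two voxels in common, scores 2*2/(3+2) = 0.8; a perfect match scores 1.0.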
Social networks, smart portable devices, and the Internet of Things (IoT), based on technologies such as big data analytics and cloud services, are emerging to support flexibly connected products and agile services as the new wave of digital transformation. Biological metaphors of living and adaptable ecosystems, combined with service-oriented enterprise architectures, provide the foundation for self-optimizing and resilient run-time environments for intelligent business services and related distributed information systems. We are extending Enterprise Architecture (EA) with mechanisms for the flexible adaptation and evolution of information systems with distributed IoT and other micro-granular digital architectures, to support the next generation of digitized products, services, and processes. Our aim is to support flexibility and agile transformation for both IT and business capabilities through adaptive digital enterprise architectures. The present research paper additionally investigates decision mechanisms in the context of multi-perspective explorations of enterprise services and Internet of Things architectures by extending original enterprise architecture reference models with state-of-the-art elements for architectural engineering and digitization.
Decentralized energy systems are characterized by ad hoc planning. The missing integration of energy objectives into business strategy creates difficulties, resulting in inefficient energy architectures and decisions. Practice-proven methods such as the balanced scorecard, enterprise architecture management, and the value network approach support the transformation path towards an effective decentralized system. The methods are evaluated based on a case study. Managing multi-dimensionality, high complexity, and multiple actors are the main drivers for an effective and efficient energy management system. The underlying basis for gaining the positive impacts of these methods on decentralized corporate energy systems is the digitization of energy data and processes.
Rational strain engineering requires solid testing of phenotypes, including productivity, and thereby ideally contributes directly to our understanding of the genotype-phenotype relationship. In fact, the test step of the strain engineering cycle is becoming the limiting step, as ever-advancing tools for generating genetic diversity exist. Here, we briefly define the challenge one faces in quantifying phenotypes and summarize existing analytical techniques that partially overcome this challenge. We argue that the evolution of volatile metabolites can be used as a proxy for cellular metabolism. In the simplest case, the product of interest is a volatile (e.g., from bulk alcohols to special fragrances) that is directly quantified over time. But nonvolatile products (e.g., from bulk long-chain fatty acids to natural products) also require major flux rerouting that potentially results in altered volatile production. While alternative techniques for volatile determination exist, rather few can be envisaged for the medium- to high-throughput analysis required for phenotype testing. Here, we contribute a detailed protocol for an ion mobility spectrometry (IMS) analysis that allows volatile metabolite quantification down to the ppb range. This sensitivity can be exploited for small-scale fermentation monitoring. The insights shared might contribute to a more frequent use of IMS in biotechnology, while the experimental aspects are of general use for researchers interested in volatile monitoring.
Motivation
(2016)
Since human beings started to work consciously with their environment, they have tried to improve the world they were living in. Early use of tools, increasing quality of these tools, use of new materials, fabrication of clay pots, and heat treatment of metals: all these were early steps of optimization. But even on lower levels of life than human beings or human society, we find optimization processes. The organization of a herd of buffalos to face their enemies, the coordinated strategies of these enemies to isolate some of the herd’s members, and the organization of bird swarms on their long flights to their winter quarters: all these social interactions are optimized strategies of long learning processes, most of them the result of a kind of collective intelligence acquired during long selection periods.
Companies are continuously changing their strategy, processes, and information systems to benefit from the digital transformation. Controlling the digital architecture and its governance is the fundamental goal. Enterprise Governance, Risk and Compliance (GRC) systems are vital for managing the digital risks threatening modern enterprises from many different angles. The most significant constituent of GRC systems is the definition of controls, which are implemented on different layers of a digital Enterprise Architecture (EA). As part of the compliance aspect of GRC, the effectiveness of these controls is assessed and reported to relevant management bodies within the enterprise. In this paper, we present a metamodel which links controls to the affected elements of a digital EA and supplies a way of expressing associated assessment techniques and results. We complement the metamodel with an expository instantiation of a control compliance cockpit in an international insurance enterprise.
New or adapted digital business models have huge impacts on Enterprise Architectures (EA) and require them to become more agile, flexible, and adaptable. All these changes happen frequently and are currently not well documented. An EA consists of many elements with manifold relationships between them; thus, changing the business model may have multiple impacts on other architectural elements. The EA engineering process deals with the development, change, and optimization of architectural elements and their dependencies. An EA thus provides a holistic view for both business and IT from the perspective of the many stakeholders involved in EA decision-making processes. Different stakeholders have specific concerns and today often collaborate in unclear decision-making processes. In our research, we investigate information from collaborative decision-making processes to support stakeholders in taking current decisions. In addition, we provide all the information necessary to understand how and why decisions were taken. We collect the decision-related information automatically to minimize manual, time-intensive work as much as possible. The core contribution of our research extends a decisional metamodel, which links basic decisions with architectural elements and extends them with an associated decisional case context. Our aim is to support a new integral method for multi-perspective and collaborative decision-making processes. We illustrate this with a practice-relevant decision-making scenario for Enterprise Architecture Engineering.
An important shift in software delivery is the definition of a cloud service as an independently deployable unit following the microservices architectural style. Container virtualization facilitates development and deployment by ensuring independence from the runtime environment. Thus, cloud services are built as container-based systems - a set of containers that control the lifecycle of software and middleware components. However, using containers leads to a new paradigm for service development and operation: self-service environments enable software developers to deploy and operate container-based systems on their own - you build it, you run it. Following this approach, more and more operational aspects are transferred into the responsibility of software developers. In this work, we propose a concept for self-adaptive cloud services based on container virtualization in line with the microservices architectural style and present a model-based approach that assists software developers in building these services. Based on operational models specified by developers, the mechanisms required for self-adaptation are automatically generated. As a result, each container automatically adapts itself in a reactive, decentralized manner. We evaluate a prototype which leverages the emerging TOSCA standard to specify operational behavior in a portable manner.
Mode & Musik
(2023)
This book expands readers' understanding of the connections between the music and fashion industries. It highlights the challenges the fashion industry currently faces with regard to hypercompetition, the definition of ever faster trends, changing consumer desires, and so on. The fashion industry is in fact strongly influenced by the digital revolution in the music industry, which has changed the face of individual music consumption and social reference and therefore also affects fashion consumption and social reference. This understanding is crucial for aligning a fashion company's strategies with the demands of today's fashion consumers.
This book showcases new and innovative approaches to biometric data capture and analysis, focusing especially on those that are characterized by non-intrusiveness, reliable prediction algorithms, and high user acceptance. It comprises the peer-reviewed papers from the international workshop on the subject that was held in Ancona, Italy, in October 2014 and featured sessions on ICT for health care, biometric data in automotive and home applications, embedded systems for biometric data analysis, biometric data analysis: EMG and ECG, and ICT for gait analysis. The background to the book is the challenge posed by the prevention and treatment of common, widespread chronic diseases in modern, aging societies. Capture of biometric data is a cornerstone for any analysis and treatment strategy. The latest advances in sensor technology allow accurate data measurement in a non-intrusive way, and in many cases it is necessary to provide online monitoring and real-time data capturing to support a patient’s prevention plans or to allow medical professionals to access the patient’s current status. This book will be of value to all with an interest in this expanding field.
The Internet has long been an integral part of marketing and sales strategies. Yet even with the use of online advertising, search engine optimization, and social media, companies often fail to gain the attention they hope for and to achieve the desired effect of their messages on customers. Through structured collaboration with so-called social influencers, it is possible to build an authentic and credible image in the B2B sector as well.
Ion mobility spectrometry coupled to multi-capillary columns (MCC/IMS) combines highly sensitive spectrometry with a rapid separation technique. MCC/IMS is widely used for biomedical breath analysis. The identification of molecules in such a complex sample necessitates a reference database. The existing IMS reference databases are still in their infancy and do not yet allow all analytes to be identified. With a gas chromatograph coupled to a mass selective detector (GC/MSD) set up in parallel to an MCC/IMS instrument, we can increase the accuracy of automatic analyte identification. To overcome the time-consuming manual evaluation and comparison of the results of both devices, we developed the software tool MIMA (MS-IMS-Mapper), which can computationally generate analyte layers for MCC/IMS spectra by using the corresponding GC/MSD data. We demonstrate the power of our method by successfully identifying the analytes of a seven-component mixture. In conclusion, the main contribution of MIMA is a fast and easy computational method for assigning analyte names to as yet unassigned signals in MCC/IMS data. We believe that this will greatly impact modern MCC/IMS-based biomarker research by 'giving a name' to previously detected disease-specific molecules.
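The mapping step can be pictured as aligning the two instruments on their shared retention-time axis. The rule below is a toy nearest-match heuristic with made-up names and tolerance; the abstract does not specify MIMA's actual matching criteria:

```python
def map_analytes(ims_peaks, msd_hits, tol_s=2.0):
    """Assign GC/MSD analyte names to MCC/IMS peaks whose retention
    times agree within tol_s seconds (toy nearest-match rule).

    ims_peaks -- {peak_id: retention_time_s} from the MCC/IMS device
    msd_hits  -- {analyte_name: retention_time_s} from the GC/MSD device
    """
    named = {}
    for peak_id, rt in ims_peaks.items():
        # find the GC/MSD hit closest in retention time
        best = min(msd_hits.items(),
                   key=lambda item: abs(item[1] - rt),
                   default=None)
        if best and abs(best[1] - rt) <= tol_s:
            named[peak_id] = best[0]    # assign the analyte name
    return named
```

A peak without any GC/MSD hit inside the tolerance window simply stays unassigned, mirroring the situation the tool is meant to resolve for the remaining signals.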
Parallel applications are the computational backbone of major industry trends and grand challenges in science. Whereas these applications are typically constructed for dedicated High Performance Computing clusters and supercomputers, the cloud emerges as an attractive execution environment, which provides on-demand resource provisioning and a pay-per-use model. However, cloud environments require specific application properties that may restrict parallel application design. As a result, design trade-offs are required to simultaneously maximize parallel performance and benefit from cloud-specific characteristics.
In this paper, we present a novel approach to assess the cloud readiness of parallel applications based on the design decisions made. By discovering and understanding the implications of these parallel design decisions on an application’s cloud readiness, our approach supports the migration of parallel applications to the cloud. We introduce an assessment procedure, its underlying meta model, and a corresponding instantiation to structure this multi-dimensional design space. For evaluation purposes, we present an extensive case study comprising three parallel applications and discuss their cloud readiness based on our approach.
The purpose of this research paper is to find out to what extent rap music merchandise is influencing today's fashion world. The research design is mainly based on analysing Internet sources. The key findings of this paper describe the way rap merchandise is created and distributed nowadays. Furthermore, it is explained how an idea becomes a trend and how rap artists influence trend creation, especially through social media channels. The topic of rap merchandising products and strategies is a very new one, so there is barely any literature to be found. Nevertheless, trend-leading online music platforms and blogs offer a lot of grey literature on the research topic. In this paper, the analysis of rap merchandise and fashion focuses on clothing items to create a better understanding of the extent to which rap merchandise influences the fashion world.
This article describes the brand management of professional football clubs through the use of social media. To become somewhat independent of unpredictable sporting success, football clubs should position themselves as brands. Traditionally, however, they have only a small marketing budget at their disposal for this purpose. Social media offers football clubs the opportunity to build and maintain their own brand relatively inexpensively and effectively. In this regard, the article explains the need for systematic brand management, addresses the particularities of marketing a professional football club, and uses examples to show how social media can be used to build and maintain a brand.
Facebook is currently the most widely used social network in the world. It is therefore not surprising that more and more companies use Facebook as part of their marketing. The integration of Facebook into brand management is increasingly becoming a success factor for innovative companies. Professional brand management with this social network offers the opportunity to generate sustainable added value. This article explores the role of Facebook in brand management. In contrast to the growing awareness of the benefits of marketing with Facebook, the risks of inadequate use often go unnoticed. The hasty and improper implementation of Facebook into the marketing mix can result in enormous economic damage as well as a loss of reputation for the brand. To minimize this risk, this article derives success factors for the use of Facebook in brand management, based on an analysis of successful marketing campaigns and best-practice examples.
Context: Software product lines are widely used in automotive embedded software development. This software paradigm improves the quality of software variants through reuse. The combination of agile software development practices with software product lines promises faster delivery of high-quality software. However, setting up an agile software product line is still challenging, especially in the automotive domain. Goal: This publication aims to evaluate to what extent agility fits automotive product line engineering. Method: Based on previous work and two workshops, agility is mapped to software product line concerns. Results: This publication presents important principles of software product lines and examines how agile approaches fit with those principles. Additionally, each principle is related to one of the four major concerns of software product line engineering: Business, Architecture, Process, and Organization. Conclusion: Agile software product line engineering is promising and can add value to existing development approaches. The identified commonalities and hindering factors need to be considered when defining a combined agile product line engineering approach.
Application systems often need to be deployed in different variants if requirements that influence their implementation, hosting, and configuration differ between customers. Therefore, deployment technologies, such as Ansible or Terraform, support a certain degree of variability modeling. However, modern application systems typically consist of various software components deployed using multiple deployment technologies that only support their proprietary, non-interoperable variability modeling concepts. The Variable Deployment Metamodel (VDMM) manages deployment variability across heterogeneous deployment technologies based on a single variable deployment model. However, VDMM currently only supports modeling conditional components and their relations, which is sometimes too coarse-grained, since it requires modeling entire components, including their implementation and deployment configuration, for each different component variant. Therefore, we extend VDMM with a more fine-grained approach for managing the variability of component implementations and their deployment configurations, e.g., if a cheap version of a SaaS deployment provides only a community edition of the software rather than the enterprise edition, which has additional analytical reporting functionality built in. We show that our extended VDMM can be used to realize variable deployments across different individual deployment technologies, using a case study and our prototype OpenTOSCA Vintner.
Managing software process evolution: traditional, agile and beyond - how to handle process change
(2016)
This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines the complete software process lifecycle, from the initial definition of a product to its systematic improvement. In doing so, it addresses difficult problems, such as how to implement processes in highly regulated domains or where to find a suitable notation system for documenting processes, and provides essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice.
Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation and addresses the questions of which process(es) to use and adapt, and how to organize process improvement programs. Subsequently, Part 2 mainly addresses process modeling. Lastly, Part 3 collects concrete approaches, experiences, and recommendations that can help to improve software processes, with a particular focus on specific lifecycle phases.
This book is aimed at anyone interested in understanding and optimizing software development tasks at their organization. While the experiences and ideas presented will be useful for both those readers who are unfamiliar with software process improvement and want to get an overview of the different aspects of the topic, and for those who are experts with many years of experience, it particularly targets the needs of researchers and Ph.D. students in the area of software and systems engineering or information systems who study advanced topics concerning the organization and management of (software development) projects and process improvements projects.
This essential addresses the question of the form and extent to which internal reporting triggers intended and unintended behavioral effects among those involved, and, conversely, how its effects are in turn influenced by the unintended behavior of participants. The "behavioral accounting" approach is applied to the specific controlling task of internal reporting. Andreas Taschner explains how reports, when their direct and indirect effects on the behavior of individual stakeholders are taken into account, become an effective instrument of corporate management.
Both writing and reading reports are considered a mandatory exercise that is rarely productive and seldom exciting - wrongly so, as this book aims to show. Internal reporting (management reporting) is nothing less than a central element of information management and thus a key success factor for any company. The book fills a gap by freeing internal reporting from its reputation as a "theory-free practitioner topic" and establishing sound theoretical foundations for all topics covered (information theory, behavioral accounting, international accounting, etc.) that remain understandable even for readers without prior knowledge. All content is divided into four distinct (and labeled) categories, which enable an individual "reading path" through the book.
Companies compete more and more as integrated supply chains rather than as individual firms. The success of the entire supply chain determines the economic well-being of the individual company. With management attention shifting to supply chains, the role of management accounting naturally must extend to the cross-company layer as well. This book demonstrates how management accounting can make a significant contribution to supply chain success. It targets students who are already familiar with the fundamentals of accounting and now want to extend their expertise in the field of cross-company (or network) management accounting. Practitioners will draw valuable insights from the text as well.
In today's business landscape, companies compete more and more as integrated supply chains rather than as individual firms. The success of the entire supply chain determines the economic well-being of each company involved. With management attention shifting to supply chains, the role of management accounting naturally must extend to the cross-company layer as well. This book demonstrates how management accounting can make a significant contribution to supply chain success. It targets students who are already familiar with the fundamentals of accounting and want to extend their expertise in the field of cross-company (or network) management accounting. Practitioners will draw valuable insights from the text as well.
This second edition includes a new chapter on Digitalization and Supply Chain Accounting, as well as new opener cases to each chapter that provide real-world examples.
Well-functioning logistics is an important competitive factor. To determine its contribution to a company's success, however, its costs must be measurable - and this is where companies often fall short. Yet approaches exist for separating logistics costs from other costs. Companies merely need to observe concrete rules for applying them.
World economic growth in recent decades has been shaped by the dynamics of digitalization and globalization in supply chains. The COVID-19 pandemic exposed the dependency and vulnerability of these supply chains. Despite a multitude of binding standards, companies have also used digitalization and the division of labor for regulatory arbitrage. On the one hand, this increases the efficiency of the economy, which in turn conserves ecological resources; on the other hand, it undermines international standards. Globalization and digitalization are thus both a blessing and a curse.
Leveraging textual information for improving decision making in the business process lifecycle
(2015)
Business process implementations fail because requirements are elicited incompletely. At the same time, a huge amount of unstructured data is not used for decision-making during the business process lifecycle. Data from questionnaires and interviews is collected but not exploited because the effort of doing so is too high. Therefore, this paper shows how to leverage textual information for improving decision-making in the business process lifecycle. To do so, text mining is used for analyzing questionnaires and interviews.
The digital transformation of our society changes the way we live, work, learn, communicate, and collaborate. The digitization of software-intensive products and services is enabled essentially by four megatrends: cloud computing, big data, mobile systems, and social technologies. This disruptive change interacts with all information processes and systems that are important business enablers for the current digital transformation. The internet of things, social collaboration systems for adaptive case management, and mobility systems and services for big data in cloud service environments are emerging to support intelligent, user-centered, and social community systems. Modern enterprises see themselves confronted with an ever-growing design space for engineering the business models of the future as well as their respective IT support. Decision analytics in this field becomes increasingly complex, and decision support, particularly for the development and evolution of sustainable enterprise architectures (EA), is duly needed. With the advent of intelligent, user-centered, and social community systems, these challenging decision processes can be supported in more flexible and intuitive ways. Tapping into these systems and techniques, the engineers and managers of the enterprise architecture become part of a viable enterprise, i.e., a resilient and continuously evolving system that develops innovative business models.
Bayer AG uses IBM Connections as its enterprise social network solution. Bayer pursues the goals of connecting its employees worldwide, supporting communication across departmental boundaries, and providing a pool of knowledge and experts. As part of a relaunch in 2012, Connections@Bayer, which had previously been available only in individual subgroups, was rolled out across the entire company. In a further relaunch in 2014, the company carried out an update to version 4.5 and an extensive communication campaign that raised awareness of the communication platform among employees and aroused curiosity. As part of this campaign, an analysis of the key benefits of using Connections was conducted, eight core messages were developed, and these were distributed via various communication channels within the company. In addition, the use of testimonials made it possible to illustrate the benefits for all employee groups. The relaunch was successful: user numbers grew and employee satisfaction increased. This case study vividly demonstrates that a relaunch of an enterprise social network, accompanied by an effective communication campaign, can bring about lasting success.
The livestock sector is growing steadily and is responsible for around 18% of global greenhouse gas emissions, which is more than the global transport sector (Steinfeld et al. 2006). This paper examines the potential of social marketing to reduce meat consumption. The aim is to understand consumers’ motivation in diet choices and to learn what opportunities social marketing can provide to counteract negative environmental and health trends. The authors believe that research to answer this question should start in metropolitan areas, because measures should be especially effective there. Based on the Theory of Planned Behaviour (TPB, Ajzen 1991) and the Technology Acceptance Model by Huijts et al. (2012), an online study with participants from the metropolitan region (n = 708) was conducted, in which central socio-psychological constructs for a reduction in meat consumption were examined. It was shown that attitude, personal norm, and habit have a critical influence on the intention to reduce meat consumption. A segmentation of consumers based on these factors led to three consumer clusters: vegetarians/flexitarians, potential flexitarians, and convinced meat eaters. Potential flexitarians are an especially relevant target group for the development of social marketing measures to reduce meat consumption. In co-creation workshops with potential flexitarians from the metropolitan region, barriers to and benefits of reducing meat consumption were identified. The factors of environmental protection, animal welfare, and desire for variety turn out to be the most relevant motivational factors. Based on these factors, consumers proposed a variety of social marketing measures, such as applications and labels that inform about the environmental impact of meat products.
This article shows that, and how, the ideas, tools, and solution approaches of lean management can be used in the sales environment. It illustrates how the value-adding share of a company's own sales processes can be increased while waste, from the customer's perspective, is minimized. A key tool for this is the value stream design method, which the author and his partners have adapted specifically to the particularities of sales processes. The focus here is on highlighting the various types of waste within such processes in order to initiate a solution-finding process. The potential of this methodology is illustrated and its application explained. Finally, the article discusses how a culture of change can be established within sales organizations as well, and how the sustainability of such changes can be fostered.
This book investigates and highlights the most critical challenges the pharmaceutical industry faces in an increasingly competitive environment of inflationary R&D investments and tightening cost control pressures. The authors present three sources of pharmaceutical innovation: new management methods in the drug development pipeline; new technologies as enablers for cutting-edge R&D; and new forms of cooperation and internationalization, such as open innovation in the early phases of R&D. New models and methods are illustrated with cases from Europe, the US, and Asia. This third fully revised edition was expanded to reflect the latest updates in open and collaborative innovation, the greater strategic importance of venture capital and early stage investments, and the new range of emerging technologies now being put to use in pharmaceutical innovation.
Digital transformation refers to the increasing digitization of content and processes and the growing importance of digital media in business and society. This change is driven, among other things, by the evolution in the use of the internet. While the so-called Web 1.0 phase focused on the publication and distribution of static content, Web 2.0 primarily stimulates processes of decentralized creation and easy distribution of user-generated content. Companies must react to these changes in order to sustainably secure their own competitiveness. This article focuses on the further development of customer service. In recent years, many companies classified customer service primarily as a cost factor of low strategic importance. This view has changed fundamentally in the course of digital transformation. Today, customers can address defects in products and services immediately, and with wide reach, through forums and social media channels. Companies must respond on the same channels to contain the multiplication of negative views and to avoid spillover effects to traditional media. At the same time, digital channels give rise to entirely new service offerings that have a lasting impact on corporate competitiveness. This article first provides an overview of the main lines of development of digital transformation. On this basis, it outlines the prospects for companies to integrate digital media into their own value chain. Above all, it discusses the transformation of customer service in the so-called Web 2.0. An outlook on future developments in digitization rounds off the article.
Customer prioritization is a common marketing activity in business practice. It aims at an increase in average customer profitability and return on sales by treating important customers more intensively. After a short introduction highlighting the importance of customer prioritization, the present article provides an overview of its key aspects. First, companies need to select a prioritization criterion, determine the method for identifying important customers, and decide on how to treat these customers in a particular way. Second, companies face challenges and need to address key requirements when implementing customer prioritization within a company. Finally, the article emphasizes positive and negative consequences of customer prioritization.
The acceleration and reorientation of technological progress overwhelm even large companies caught between specialization and interdisciplinary convergence. Combining internal research and development with external knowledge, especially in high technologies, thus becomes a central prerequisite for long-term corporate success. In this context, this dissertation examines the potential of cooperative behavior between companies for coping with technological discontinuities, using the example of the imminent paradigm shift in automotive propulsion. Cooperation is identified as a superior strategy for stimulating the explorative mode of innovation and is integrated into an overarching dynamic of coordination suitability over the course of technological progress. With regard to automotive propulsion, a sustainability-induced destabilization of the technological paradigm of the combustion engine can be observed, while its intensive possibilities are becoming exhausted. The consequence is increasing pressure to innovate, which, for reasons of consistency, forces a systemic transformation of power plant technology and the energy grid as well as a paradigm shift to electric propulsion. Due to the currently low technological maturity and high costs of electric drive systems, however, a transition in the form of a gradual reconfiguration via a hybrid phase is emerging, whose dynamics depend largely on the development of the key techno-economic modules: the battery and the fuel cell. The technological transformation this requires poses existential threats to the established companies of the automotive industry, which, in terms of exploration, find themselves in an inferior starting position vis-à-vis their challengers. It is precisely here that extensive potentials for the cooperative exploration of electric propulsion arise at the levels of behavior, innovation processes, and knowledge.
In relation to these potentials, however, the actual level of cooperation appears low, volatile, and, especially in Germany, excessively focused within single sectors.
These findings yield implications for corporate management, innovation policy, and research. For management, the central challenge lies in enabling the organization to dynamize knowledge and capabilities through the simultaneous, heterogeneous coordination of explorative and exploitative innovation streams. Tapping cooperative potentials in particular, however, requires a willingness to limit one's own independence and to deviate from established behavioral patterns. For innovation policy, the priorities are overcoming inertia by adapting the socio-institutional framework and promoting long-term cooperation with potential-driven intersectorality. For research, the combination of innovation, sustainability, and coordination theory in particular opens up a better understanding of the drivers and dynamics of technological progress, which should be explored further.
The aim of this article is to show how the sociology of conventions (économie des conventions, EC) can contribute to understanding the phenomenon of organizational routines. After a brief introduction to current routine research and to EC, two potential answers to this question are presented. First, EC can help enrich existing models and conceptualizations of organizational routines. In particular, EC can capture justification processes in routine action that have so far gone unconsidered. Second, EC can form an independent, i.e., genuine, approach to observing organizational routines. Here, the article follows the observation of Brandl et al. (2014, p. 314) that the partial adoption of individual ideas from EC and their integration into other theoretical concepts (such as, here, the organizational routine) is hardly capable of exploiting the full potential of EC for explaining organizational phenomena. This article therefore sets out the essential elements from which a genuinely convention-based understanding of organizational routines could be developed. The article closes with a discussion and a conclusion.