Refine
Year of publication
- 2016 (57)
Document Type
- Book chapter (57)
Is part of the Bibliography
- yes (57)
Institute
- Informatik (23)
- ESB Business School (15)
- Technik (11)
- Life Sciences (6)
- Texoversum (1)
Publisher
- Springer (16)
- Springer Gabler (12)
- Gesellschaft für Informatik (11)
- Wiley (6)
- Routledge (3)
- Elsevier (2)
- De Gruyter Oldenbourg (1)
- Economia (1)
- Nova Publishers (1)
- Public Verlagsgesellschaft (1)
Extended workbench, global sourcing, low-cost-country potentials, outsourcing: for years these buzzwords have been used inflationarily in executive boardrooms. The market for professional services has long ceased to be limited to the strategy consulting industry. By now there is hardly any business process in a manufacturing company that has not been examined for its suitability for outsourcing to consultants or service providers. The author describes why professional service providers are increasingly commissioned and, using a service performance study as an example, identifies the success factors for the collaboration. Based on a Collaboration Check List, she sets out what matters in day-to-day work. Finally, the risks and opportunities of using such services are examined from the customer's and the supplier's perspective.
Instead of waiting for and constantly adapting to the details of political interventions, utilities need to view their environment from a holistic perspective. The unique position of the company - be it a local utility, a bigger player, or an international utility specializing in specific segments - has to be the basis of goals and strategies. But without the consistent translation of these goals and strategies into processes, structures, and company culture, a strategy remains pure theory. Companies need to engage in a continuing learning process. This means being willing to let go of strategies, to slow down or speed up, to work from a different angle, and so on.
Branding in sports
(2016)
Brands are ubiquitous in the sports business. The significance of the brand is fuelled not only by the various functions that a brand performs for providers and consumers in sports, but also by the monetary value that brands have come to represent for sporting organizations. As part of the commercialization and professionalization of sports, a uniform brand presence is becoming increasingly important for sporting organizations. The implication is the need for systematic and integral brand management. This chapter initially examines the key features of sports from the marketing perspective and the most important fundamentals of sport marketing. Based on this, we demonstrate specifically how brands in sports are established and cultivated.
Besides optimising the car itself, energy efficiency and safety can also be increased by optimising driving behaviour. Based on this fact, a driving system is in development whose goal is to educate the driver in energy-efficient and safe driving. It monitors the driver, the car and the environment and gives recommendations relevant to energy efficiency and safety. However, the driving system tries not to distract or bother the driver, for example by withholding recommendations during stressful driving situations or when the driver is not interested in a recommendation. Therefore, the driving system monitors the stress level of the driver as well as the driver's reaction to a given recommendation and decides whether to give a recommendation or not. This allows recommendations to be suppressed when needed and, thus, increases both road safety and the user acceptance of the driving system.
A lot of people need help in their daily lives to wash, select and manage their clothing. The goal of this work is to design an assistant system (eKlarA) that supports the user by giving recommendations for choosing clothing combinations, finding the clothing and washing the clothing. The idea behind eKlarA is to build a system that uses sensors to identify each garment and its state in the clothing cycle. The clothing cycle consists of the stations closet, laundry basket and washing machine, in one or several places. The system uses information about the clothing, the weather and the calendar to support the user in the different steps of the clothing cycle. The first prototype of this system has been developed and tested, and the test results are presented in this work.
Stress is recognized as a predominant disease with rising costs for rehabilitation and treatment. Currently there are several different approaches that can be used for determining and calculating stress levels. Usually the methods for determining stress are divided into two categories. The first category does not require any special equipment for measuring stress; it uses the variations in behaviour patterns that occur under stress. The core disadvantage of this category is its limitation to specific use cases. The second category uses laboratory instruments and biological sensors. This category allows stress to be measured precisely and proficiently, but at the same time these instruments are not mobile or transportable and do not support real-time feedback. This work presents a mobile system that provides the calculation of stress. To achieve this, the signal of a mobile ECG sensor is analysed, processed and visualised on a mobile device such as a smartphone. This work also explains the stress measurement algorithm used. The result of this work is a portable system that can be used with a mobile device such as a smartphone as the visual interface for reporting the current stress level.
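ECG-based stress estimation of the kind described above is commonly built on heart-rate variability. A minimal sketch, assuming RR intervals in milliseconds as input; the RMSSD metric and the threshold are illustrative choices, not the algorithm used in the chapter:

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive RR-interval differences, a common
    heart-rate-variability metric; lower values suggest higher stress."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def stress_level(rr_intervals_ms, threshold_ms=30.0):
    """Crude binary classifier: RMSSD below the threshold maps to 'stressed'."""
    return "stressed" if rmssd(rr_intervals_ms) < threshold_ms else "relaxed"

# Very regular heartbeats (low variability) vs. irregular ones
print(stress_level([800, 802, 801, 803, 802]))   # low RMSSD -> "stressed"
print(stress_level([800, 850, 780, 870, 790]))   # high RMSSD -> "relaxed"
```

In a real system the threshold would be calibrated per person and the RR intervals extracted from the raw ECG by a QRS detector; the sketch only shows the final classification step.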
Stress is becoming an important topic in modern life. The influence of stress results in a higher rate of health disorders such as burnout, heart problems, obesity, asthma, diabetes, depression and many others. Furthermore, an individual's behavior and capabilities can be directly affected, leading to altered cognition and impaired decision-making and problem-solving skills. In a dynamic and unpredictable environment such as the automotive domain, this can result in a higher risk of accidents. Different papers have addressed the estimation as well as the prediction of drivers' stress levels during driving. Another important question is not only the stress level of the driver himself, but also the influence on and of a group of other drivers in the near area. This paper proposes a system which determines groups of drivers in a near area as clusters and derives their individual stress levels. This information is analyzed to generate a stress map, which gives a graphical view of road sections with a higher stress influence. Aggregated data can be used to generate navigation routes with a lower stress influence in order to reduce stress-influenced driving as well as improve road safety.
Broad acceptance of finite-element-based analysis of structural problems and the increased availability of CAD-systems for structural tasks, which help to generate meshes of non-trivial geometries, have been setting a standard for the evaluation of designs in mechanical engineering in the last few decades. The development of automated or semi-automated optimizers, integrated into the Computer-Aided Engineering (CAE)-packages or working as outer loop machines, requiring the solver to do the analysis of the specific designs, has been accepted by most advanced users of the simulation community as well. The availability and inexpensive processing power of computers is increasing without any limitations foreseen in the coming years. There is little doubt that virtual product development will continue using the tools that have proved to be so successful and so easy to handle.
Current fields of interest
(2016)
If we review the research done in the field of optimization, the following topics appear to be the focus of current development:
– Optimization under uncertainties, taking into account the inevitable scatter of parts, external effects and internal properties. Reliability and robustness both have to be taken into account when running optimizations, so the name Robust Design Optimization (RDO) came into use.
– Multi-Objective Optimization (MOO) handles situations in which different participants in the development process are developing in different directions. Typically we think of commercial and engineering aspects, but other constellations have to be looked at as well, such as comfort and performance or price and consumption.
– Process development of the entire design process, including optimization from early stages, might help avoid inefficient efforts. Here the management of virtual development has to be re-designed to fit into a coherent scheme.
...
There are many other fields where interesting progress is being made. We limit our discussion to the first three questions.
To illustrate the power and the pitfalls of Bionic Optimization, we will show some examples spanning classes of applications and employing various strategies. These applications cover a broad range of engineering tasks. Nevertheless, there is no guarantee that our experiences and our examples will be sufficient to deal with all questions and issues in a comprehensive way. As a general rule, novices should begin each class of problems with a learning phase. So, in this introductory phase, we use simple and quick examples, e.g., small FE-models, linear load cases, short time intervals and simple material models. Here beginners within the Bionic Optimization community can learn which parameter combinations to use. In Sect. 3.3 we discuss strategies for accelerating optimization studies. Using these parameters as starting points is one way to set the specific ranges, e.g., the number of parents and kids, crossing, mutation radii, and the number of generations. On the other hand, these trial runs will doubtless indicate that Bionic Optimization needs large numbers of individual designs, and considerable time and computing power. We recommend investing enough time in preparing each task in order to avoid the frustration should large jobs fail after long calculation times.
Application to CAE systems
(2016)
Due to the broad acceptance of CAD-systems based on 3D solids, the geometric data of all common CAE (Computer-Aided Engineering) software, at least in mechanical engineering, are based on these solids. We use solid models, where the space filled by material is defined in a simple and easily usable way. Solid models allow for the development of automated meshers that transform solid volumes into finite elements. Even after some unacceptable initial trials, users are able to generate meshes of non-trivial geometries within minutes to hours, instead of days or weeks. Once meshing was no longer the cost-limiting factor of finite element studies, numerical simulation became a tool for smaller industries as well.
In the early days of automated meshing development, there were discussions over the use of tetrahedral (Fig. 4.1) or hexahedral meshes. But after a short period of time it became evident that there were, and will always be, many problems in using automated meshers to generate hexahedral elements. So today nearly all automated 3D-meshing systems use tetrahedral elements.
We have seen that bionic optimization can be a powerful tool when applied to problems with non-trivial landscapes of goals and restrictions. This, in turn, led us to a discussion of useful methodologies for applying this optimization to real problems. On the other hand, it must be stated that each optimization is a time-consuming process as soon as the problem expands beyond a small number of free parameters related to simple parabolic responses. Bionic optimization is not a quick approach to solving complex questions within short times. In some cases it has the potential to fail entirely, either by sticking to local maxima or by randomly exploring the parameter space without finding any promising solutions. The following sections present some remarks on the efficiency and limitations users must be aware of. They aim to broaden the knowledge base for using bionic optimization, but they should not discourage potential users from this promising field of powerful strategies for finding good or even the best possible designs.
In this chapter we introduce methods to improve mechanical designs by bionic methods. In most cases we assume that a general idea of the part or system is given by a set of data or parameters. Our task is to modify these free parameters so that a given goal or objective is optimized without violation of any of the existing restrictions.
Motivation
(2016)
Since human beings started to work consciously with their environment, they have tried to improve the world they were living in. Early use of tools, increasing quality of these tools, use of new materials, fabrication of clay pots, and heat treatment of metals: all these were early steps of optimization. But even on lower levels of life than human beings or human society, we find optimization processes. The organization of a herd of buffalos to face their enemies, the coordinated strategies of these enemies to isolate some of the herd’s members, and the organization of bird swarms on their long flights to their winter quarters: all these social interactions are optimized strategies of long learning processes, most of them the result of a kind of collective intelligence acquired during long selection periods.
This article studies the development of e-governance over time and across countries. We use a large data sample consisting of 99 developing and 34 OECD countries to study this notion. Firstly, we study the development of e-governance. Secondly, we estimate models to check the determining factors of e-governance over time and across countries. The study reveals that the level of e-governance is determined by the degree of e-participation, online access as well as GDP per capita.
This article contributes to marketing research by fundamentally developing the young but increasingly relevant research stream on customer experience management (CEM). On the one hand, the identified framework shows that CEM goes beyond individual organizational capabilities, such as the design of service experiences, that have dominated CEM research so far. On the other hand, the framework contributes to the synthesis of fragmented but interrelated streams of literature in marketing research ...
The troubles began when Tom, the business analyst, asked the customer what he wants. The customer came up with good ideas for software features. Tom created a brilliant roadmap and defined the requirements for a new software product. Mary, the development team leader, was already eager to start developing and happy when she got the requirements. She and her team went ahead and created the software right away. Afterwards, Paul tested the software against the requirements. As soon as the software fulfilled the requirements, Linda, the product manager, deployed it to the customer. The customer did not like the software and ignored it. Ringo, the head of software development, was fired. How come? Nowadays, we have tremendous capabilities for creating nearly all kinds of software to fulfill the needs of customers. We can apply agile practices for reacting flexibly to changing requirements, we can use distributed development, open source, or other means for creating software at low cost, we can use cloud technologies for deploying software rapidly, and we can get enormous amounts of data showing us how customers actually use software products. However, the sad reality is that around 90% of products fail, and more than 60% of the features of a typical software product are rarely or never used. But there is a silver lining – an insight regarding successful features: Around 60% of the successes stem from a significant change of an initial idea. This gives us a hint on how to build the right software for users and customers.
The development of dynamic balanced scorecards in close cooperation with customers is part of the consulting portfolio of the PA Consulting Group. The application example describes a dynamic balanced scorecard of a European automobile manufacturer. This manufacturer has always pursued the goal of setting international standards in technology, style, design and performance. However, the company found itself exposed to increasing Asian competition, which it wanted to counter with newly developed vehicles. In order to bring these to market ahead of the competition, the development processes were to be streamlined considerably. At the same time, the vehicles were to be offered at attractive prices with competitive, high-quality equipment, and the company-wide profitability targets were to be met.
The digital transformation of our society changes the way we live, work, learn, communicate, and collaborate. The digitization of software-intensive products and services is basically enabled by four megatrends: cloud computing, big data, mobile systems, and social technologies. This disruptive change interacts with all information processes and systems that are important business enablers for the current digital transformation. The internet of things, social collaboration systems for adaptive case management, and mobility systems and services for big data in cloud service environments are emerging to support intelligent user-centered and social community systems. Modern enterprises see themselves confronted with an ever-growing design space for engineering the business models of the future as well as their IT support. Decision analytics in this field becomes increasingly complex, and decision support, particularly for the development and evolution of sustainable enterprise architectures (EA), is much needed. With the advent of intelligent user-centered and social community systems, these challenging decision processes can be supported in more flexible and intuitive ways. Tapping into these systems and techniques, the engineers and managers of the enterprise architecture become part of a viable enterprise, i.e. a resilient and continuously evolving system that develops innovative business models.
The evolution of Services Oriented Architectures (SOA) presents many challenges due to their complex, dynamic and heterogeneous nature. We describe how SOA design principles can facilitate SOA evolvability and examine several approaches to support SOA evolution. SOA evolution approaches can be classified based on the level of granularity they address, namely, service code level, service interaction level and model level. We also discuss emerging trends, such as microservices and knowledge-based support, which can enhance the evolution of future SOA systems.
The goal of the present study was to analyze the relationship between the implementation of CRM processes and customer satisfaction. Our investigation is subject to some fundamental limitations. CRM is still a relatively young field of research, and its processes will very likely continue to evolve over time. Some practices will be identified as ineffective and discarded; other existing processes will be improved. It can also be expected that new processes and activities will be developed and introduced. As a consequence of these developments, it is possible that the effect of implementing CRM processes on customer satisfaction reported here will also change over time. An interesting research approach would therefore be to observe this evolution over time.
Furthermore, it must be kept in mind that customer satisfaction is merely a pre-economic goal of CRM. Individual investments in an existing business relationship must be made on the basis of the customer's value to the company. Moreover, if no alternatives to the current provider exist, it is not economically sensible to commit resources to increasing customer satisfaction, since a change of provider is unlikely.
Finally, for the present study we used scales capturing the companies' assessment of their customers' attitudes. Since this approach requires assessments that are as accurate as possible, the data may exhibit certain biases. Future research could usefully extend the study with a complementary assessment by the customers themselves for cross-validation.
The analysis of the geometric parameters of the tools and of the kinematically determined engagement conditions in milling shows a considerable influence on the cutting-edge loads during operation. Eccentric clamping of shank tools leads to a markedly higher load on the eccentric cutting edges. This load lies well above the force modulation produced by the unequal pitch. Furthermore, the impulse loads at the cutting-edge entries excite the resonances of the structure. On the one hand, this affects the measurements taken with the force measurement platform; on the other hand, during real machining this interaction affects the surfaces of the machined parts.
This article shows that, and how, the ideas, tools and solution approaches of lean management can be used in the sales environment. It illustrates how the value-adding share of a company's own sales processes can be increased while, at the same time, waste from the customer's point of view is minimized. An essential tool for this is the value stream design method, which the author and his partners have adapted specifically to the particularities of sales processes. The focus here is on highlighting the many different kinds of waste within processes of this type in order to initiate a solution-finding process. The potential of this methodology is illustrated and its application explained. Finally, it is discussed how a culture of change can be established within sales organizations as well, and how the sustainability of changes can be fostered.
Different sensor types using chemical and biochemical principles are described. The former are mainly gas sensors, the latter are applied especially to liquids. Those label-free direct detection methods are compared with applications where assays take advantage of labeled receptors.
Furthermore, selected applications in the area of gas sensors are discussed, and sensors for process control, point-of-care diagnostics, environmental analytics, and food analytics are reviewed. In addition, multiplexing approaches used in microplates and microarrays are described.
On account of the huge number of sensor types and the wide range of possible applications, only the most important ones are selected here.
From the theoretical perspective of the sociology of conventions (économie des conventions, EC), this article extends research on the pragmatic dimension of organizational memory. It argues, first, that conventions can be understood as organizational memory in which it is stored how coordination problems can be solved successfully. Second, drawing on the actor status in EC and on the concept of regimes of engagement, it discusses how actors access stored knowledge. Third, it analyzes the hitherto neglected normative dimension of organizational memory, arguing that actors justify themselves with reference to conventions when they take up elements of organizational memory. Overall, the article contributes to a better understanding of the connection between collective memory and decision-making by viewing it, on the basis of EC, as an interactionist, pragmatic and normatively shaped negotiation of memories in concrete situations.
Marketing events engage the audience emotionally in a special way. Changes in attitudes and improvements in image are therefore the central objectives of event marketing. Building on the essential fundamentals of event marketing, this chapter focuses on controlling in event marketing. It addresses performance measurement and impact research in event marketing. The conditions under which an image transfer from an event to a brand or a company takes place are explained, and methods for measuring image are presented.
Pultrusion of braids
(2016)
Virtual prototyping of integrated mixed-signal smart sensor systems requires high-performance co-simulation of analog frontend circuitry with complex digital controller hardware and embedded real-time software. We use SystemC/TLM 2.0 in conjunction with a cycle-count accurate temporal decoupling approach (TD) to simulate digital components and firmware code execution at high speed while preserving clock-cycle accuracy and, thus, real-time behavior at time quantum boundaries. Optimal time quanta ensuring real-time capability can be calculated and set automatically during simulation if the simulation engine has access to exact timing information about upcoming inter-process communication events. These methods fail in the case of non-deterministic, asynchronous events, resulting in potentially invalid simulation results. In this paper, we propose an extension to the case of asynchronous events generated by blackbox sources from which a priori event timing information is not available, such as coupled analog simulators or hardware in the loop. Additional event processing latency or rollback effort caused by temporal decoupling is minimized by calculating optimal time quanta dynamically in a SystemC model using a linear prediction scheme. We analyze the theoretical performance of the presented predictive temporal decoupling approach (PTD) by deriving a cost model that expresses the expected simulation effort in terms of key parameters such as time quantum size and CPU time per simulation cycle. For an exemplary smart-sensor system model, we show that quasi-periodic events that trigger activities in TD processes are handled accurately after the predictor has settled.
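The linear prediction of upcoming event times that the paper relies on can be illustrated with a toy least-squares extrapolation of quasi-periodic event timestamps. This is a hedged sketch of the general idea only; the function, the data and the quantum calculation are illustrative and do not reproduce the paper's PTD scheme:

```python
def predict_next_event(timestamps):
    """Fit t_k = a*k + b to observed event times by least squares and
    extrapolate the next occurrence; the gap to the last observed event
    then bounds the next temporal-decoupling time quantum."""
    n = len(timestamps)
    ks = range(n)
    mean_k = sum(ks) / n
    mean_t = sum(timestamps) / n
    cov = sum((k - mean_k) * (t - mean_t) for k, t in zip(ks, timestamps))
    var = sum((k - mean_k) ** 2 for k in ks)
    a = cov / var              # estimated event period
    b = mean_t - a * mean_k    # offset of the fitted line
    return a * n + b           # predicted time of event number n

# Quasi-periodic asynchronous events with small jitter (arbitrary time units)
events = [0.0, 10.2, 19.9, 30.1, 40.0]
next_t = predict_next_event(events)
quantum = next_t - events[-1]  # run decoupled processes at most this far ahead
```

For the jittered 10-unit period above, the predictor settles on roughly 10 units per event, so the decoupled processes would be granted a quantum of about one period before the next synchronization point.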
With largely identical equipment and new demands on university computing centres, cooperation is increasingly coming into focus. For such cooperation, including cooperation across different types of universities, the classical informal framework is often no longer sufficient. Several preconditions must be met for a successful collaboration. Computing centres take on a new role as providers of services for users outside their own university as well. Likewise, they may increasingly find themselves in the user's role in the future. IT service institutions must become aware of their new role as both service providers and users of third-party services, and must take this into account when designing new services.
Industrie 4.0 - Ausblick
(2016)
For companies it is important to set the strategic course for their Industrie 4.0 direction early on and to build up experience in dealing with Industrie 4.0 technologies. However, some of the technologies relevant to Industrie 4.0 are not expected to realize their full efficiency potential for another 5 to 10 years. The introduction of Industrie 4.0 affects almost all areas of a company and must therefore be understood, planned and actively managed not only as a digital transformation but also as a cultural change within the organization. Topics such as data protection and IT security are not only important prerequisites for a successful introduction of Industrie 4.0; they must also be anchored consistently and end-to-end in the digital systems as essential acceptance and success factors.
For financial reasons, SMEs often do not see themselves in a position to invest in fundamental Industrie 4.0 technologies. The main reservation cited is a supposedly poor cost-benefit ratio and long payback cycles. The current challenges lie rather in ever-advancing internationalization and the growing pressure to innovate created by competition. It is, of course, well known that the increasing interconnection of production facilities in Industrie 4.0 also entails risks for IT and data security. Problems with data quality, stability, interfaces and legal issues also contribute significantly to companies' uncertainty. Given the ever-increasing interconnection between companies and stakeholders in the future, supplier companies in particular must see themselves as obliged to take up the topic of Industrie 4.0 and engage with it. Precisely these companies must realize that only by deploying suitable information and communication technologies in the future will they still be able to remain part of the value chain between their customers and suppliers.
Rapidly growing data volumes push today's analytical systems close to the feasible processing limit. Massive parallelism is one possible solution to reduce the computational time of analytical algorithms. However, data transfer becomes a significant bottleneck, since moving data to code blocks system resources. Technological advances make it economical to place compute units close to storage and to perform data processing operations close to the data, minimizing data transfers and increasing scalability. Hence the principle of Near Data Processing (NDP) and the shift towards code-to-data. In the present paper we claim that the development of NDP system architectures will become an inevitable task in the future. Analytical DBMSs like HPE Vertica offer multiple points of impact with major advantages, which are presented in this paper.
Many organizations have identified the opportunities of big data analytics to support the business with problem-specific insights through the exploitation of generated data. Socio-technical solutions are developed in big data projects to reach competitive advantage. Although these projects are aligned to specific business needs, common architectural challenges are not addressed in a comprehensive manner. Enterprise architecture (EA) management is a holistic approach to tackling the complex business and IT architecture. The transformation of an organization's EA is influenced by big data projects and their data-driven approach on all layers. To enable strategy-oriented development of the EA, it is essential to synchronize these projects with the support of EA management. In this paper, we conduct a systematic review of the big data literature to analyze which requirements for the EA management discipline are proposed. Thereby, a broad overview of existing research is presented to facilitate a more detailed exploration and to foster the evolution of the EA management discipline.
Nowadays almost every major company runs a monitoring system and produces log data to analyse its systems. To perform analyses on the log data and to extract experience for future decisions, it is important to transform and synchronize different time series. Several methods for synchronizing multiple time series are provided, leading to a synchronized uniform time series. This is achieved by using discretisation and approximation methods. Furthermore, discretisation through ticks is demonstrated and the respective results are illustrated.
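The synchronization described above can be sketched as resampling irregular series onto a common tick grid. This is a minimal illustration using last-value ("sample-and-hold") approximation; the function and data are illustrative assumptions, not the methods from the paper:

```python
def to_ticks(series, tick, start, end):
    """Resample an irregular time series [(timestamp, value), ...] onto a
    uniform tick grid by carrying the last observed value forward.
    Ticks before the first observation carry None."""
    series = sorted(series)
    out, i, last = [], 0, None
    t = start
    while t <= end:
        # Consume all observations up to and including the current tick
        while i < len(series) and series[i][0] <= t:
            last = series[i][1]
            i += 1
        out.append((t, last))
        t += tick
    return out

# Two log streams with differing, irregular timestamps ...
cpu = [(0.3, 10), (1.7, 30), (3.1, 20)]
mem = [(0.1, 512), (2.4, 768)]
# ... become comparable on a shared 1-second grid.
cpu_u = to_ticks(cpu, 1.0, 0.0, 3.0)
mem_u = to_ticks(mem, 1.0, 0.0, 3.0)
```

After resampling, both series share the tick timestamps 0.0, 1.0, 2.0 and 3.0, so values from different monitoring sources can be joined and analysed pointwise. Other approximation choices (linear interpolation, averaging within a tick) fit the same skeleton.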
Reality mining refers to an application of data mining that uses sensor data to derive behavioral patterns in the real world. However, research in this field started a decade ago, when technology was far behind today's state of the art. This paper discusses which requirements are now posed to applications in the context of reality mining. A survey shows which sensors are available in state-of-the-art smartphones and usable for gathering reality mining data. As another contribution of this paper, a reality mining application architecture is proposed to facilitate the implementation of such applications. A proof of concept verifies the assumptions made about reality mining and the presented architecture.
Digital companies need information systems to implement their business processes end-to-end. BPM systems are promising candidates for this because their business process model-driven operation mode makes them highly adaptable. End-to-end processes contain different types of sub-processes that are either procedural, data-driven, or business rule-based. Modern BPM systems support modeling notations for all these types of sub-processes. Moreover, end-to-end processes contain shadow processing parts, which consequently must be supported in a performant way, too. BPMN seems to be the adequate notation for modeling these parts due to its procedural nature. Furthermore, BPMN provides several elements for modeling parallel execution, which are very interesting for accelerating the shadow processing parts of a process. The present paper examines the limitations and potentials of BPM systems for the high-performance execution of BPMN models representing shadow processing parts of a business process.
Business process management and IT-supported processes are a topical subject. Finding a business process management system that implements a company's processes in the best way is not easy and takes a lot of time. This article provides a recommendation for an open source system. Four selected open source workflow management systems are tested and analyzed. The main criteria for the evaluation are listed in a criteria catalogue and rated by experts according to their importance. Finally, the systems are evaluated against the criteria, and the best-rated system can be recommended.
Converting users into customers : the role of user profile information and customer journey analysis
(2016)
Due to the digital transformation, the importance of web analysis and user profiling for enterprises is increasing rapidly, as customers focus on digital channels to obtain information about products and brands. While a lot of research exists on these topics, only a minority of firms use it to their advantage. This study aims to tighten the link between research and business such that experimental methods can be used to improve communication strategies in practice. Therefore, a systematic literature analysis is conducted, workshops are observed and documented, and an empirical study is used to integrate the individual steps into a framework for the practical usage of user profiling and customer journey analysis.
The acquisition of data for reality mining applications is a critical factor, since many mobile devices, e.g. smartphones, must be capable of capturing the required data. Otherwise, only a small target group would be able to use a reality mining application. In the course of a survey, we identified smartphone features that might be relevant for various reality mining applications. The survey classifies these features and shows how the support of each feature has changed over the years by analyzing 143 smartphones released between 2004 and 2015. All analyzed devices can be ranked by the number of features they provide. Furthermore, this paper deals with quality issues that occurred while carrying out the survey.
The Internet of Things (IoT) refers to the interconnectedness of physical objects and works by equipping the latter with sensors and actuators as a means to connect to the internet. The number of connected things has increased threefold over the past five years. Consequently, firms expect the IoT to become a source of new technology-driven business models. However, only a few early adopters have started to install and use IoT appliances on a frequent basis, so it is still unclear which factors drive the technological acceptance of IoT appliances. Confronting this gap in current research, the present paper explores how IoT appliances are conceptually defined, which factors drive their technological acceptance, and how firms can use the results to improve value propositions in corresponding business models. It is found that IoT appliance vendors need to support a broad focus, as potential buyers are highly heterogeneous. Drawing on this insight, the paper illustrates some flexible marketing strategies.
The question of why individuals adopt information technology has been present in information systems research for the past quarter century. One of the most widely used models for predicting technology usage was introduced by Fred Davis: the Technology Acceptance Model (TAM). It describes the influence of perceived usefulness and perceived ease of use on attitude, behavioral intention, and system usage. The first two factors are in turn influenced by external variables. Although a plethora of papers exists about the TAM, an extensive analysis of the role of the external variables in the model is still missing. This paper aims to give an overview of the most important variables. In an extensive literature review, we identified 763 relevant papers, found 552 unique external variables, characterized the most important of them, and described the frequency of their appearance. Additionally, we grouped these variables into four categories (organizational characteristics, system characteristics, user personal characteristics, and other variables). Afterwards, we discuss the results and show implications for theory and practice.
At Bayer AG, IBM Connections is used as the solution for the enterprise social network. Bayer pursues the goal of connecting employees worldwide, supporting communication across divisional boundaries, and providing a pool of knowledge and experts. As part of a relaunch in 2012, Connections@Bayer, which had previously been available only in individual subgroups, was rolled out across the entire company. In a further relaunch in 2014, the company carried out an update to version 4.5 and an extensive communication campaign that raised awareness of the communication platform among employees and aroused curiosity. In the course of this campaign, an analysis of the key benefits of using Connections was conducted, eight core messages were developed, and these were distributed via various communication channels within the company. In addition, the use of testimonials made it possible to present the benefits for all employee groups. This relaunch was successful: user numbers grew and employee satisfaction increased. This case study vividly demonstrates that a relaunch of an enterprise social network accompanied by an effective communication campaign can bring about lasting success.
Companies have recently been increasingly concerned with the use of social media in internal communication and collaboration. So-called enterprise social networks (ESN) offer integrated platforms with profiles, blogs, shared document management, wikis, chats, and group and comment functions for internal corporate use. Very often, substantial investments are involved. The budgets are spent primarily on IT; "soft factors" are frequently left out. This can lead to considerable problems with the acceptance of such platforms. Further measures for steering the introduction and operation of ESN are therefore required, which can be summarized under the term governance. The construct of governance refers to the type and scope of the roles and tasks for steering the use of ESN. This article examines possible governance models for the introduction and further development of ESN. The results of this research were obtained on the basis of a thorough literature analysis and an exploratory survey of executives responsible for the use of ESN in large German companies. The implications of the qualitative data analysis point to relationships that can serve as initial hypotheses for further research.
Enterprise Social Networks : Einführung in die Thematik und Ableitung relevanter Forschungsfelder
(2016)
The relevance of enterprise social networks (ESN) for everyday work in knowledge organizations is growing. These networks support communication, collaboration, and knowledge management in companies. This article provides an introduction to the topic of ESN and outlines possible applications, potentials, and challenges. It gives an overview of key articles that survey research in the field of ESN. Individual research contributions are then analyzed and further research potentials are derived. This leads to eight promising areas for future research: 1) user behavior, 2) effects of ESN use, 3) management, leadership, and governance of ESN, 4) value determination and success measurement, 5) cultural effects, 6) architecture and design of ESN, 7) theories, research designs, and methods, and 8) further challenges relating to ESN. The article characterizes these areas and formulates exemplary open questions for future research.
Despite a growing awareness of the importance of brand management at the club level, professional brand management within Bundesliga clubs lags considerably behind. To date, the principles of identity-based brand management have rarely been applied, and most clubs have, despite high economic potential, forgone the opportunity to create competitive advantages both economically and in sporting terms. In this chapter we examine the success factors of identity-based brand management of professional football clubs using the concrete case of Borussia Dortmund.
Despite 30 years of Electronic Design Automation, analog IC layouts are still handcrafted in a laborious fashion today due to the complex challenge of considering all relevant design constraints. This paper presents Self-organized Wiring and Arrangement of Responsive Modules (SWARM), a novel approach addressing the problem with a multi-agent system: autonomous layout modules interact with each other to evoke the emergence of overall compact arrangements that fit within a given layout zone. SWARM's unique advantage over conventional optimization-based and procedural approaches is its ability to consider crucial design constraints both explicitly and implicitly. Several examples show that, by inducing a synergistic flow of self-organization, remarkable layout results can emerge from SWARM's decentralized decision-making model.
This article describes the brand management of professional football clubs through the use of social media. To become somewhat independent of unplannable sporting success, football clubs should position themselves as brands. Traditionally, however, they only have a small marketing budget at their disposal for this purpose. Social media offers football clubs the opportunity to build and maintain their own brand relatively inexpensively and effectively. In this regard, the article explains the need for systematic brand management, addresses the particularities of marketing a professional football club, and uses examples to show how social media can be used to build and maintain a brand.