Refine
Year of publication
- 2016 (57)
Document Type
- Book chapter (57)
Is part of the Bibliography
- yes (57)
Institute
- Informatik (23)
- ESB Business School (15)
- Technik (11)
- Life Sciences (6)
- Texoversum (1)
Publisher
- Springer (29)
- Gesellschaft für Informatik e.V. (11)
- Wiley (6)
- Elsevier (3)
- Routledge (3)
- De Gruyter (1)
- Economica (1)
- Nova Science Publishers (1)
- Public Verlagsgesellschaft und Anzeigenagentur (1)
- Vahlen (1)
Business process management and IT-supported processes are a current topic. Finding a business process management system that best implements an organization's processes is not easy and takes considerable time. This article provides a recommendation for an open source system. Four selected open source workflow management systems are tested and analyzed. The main evaluation criteria are listed in a criteria catalogue and weighted by experts according to their importance. Finally, the systems are evaluated against these criteria, and the best-rated system is recommended.
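The evaluation procedure described above can be illustrated with a minimal sketch. The criteria names, expert weights, and per-system ratings below are illustrative placeholders, not the article's actual catalogue or data:

```python
# Hypothetical weighted-criteria evaluation: experts assign importance
# weights to catalogue criteria; each candidate system is rated per
# criterion; the highest weighted sum wins the recommendation.

criteria_weights = {"usability": 0.40, "extensibility": 0.35, "documentation": 0.25}

system_scores = {
    "SystemA": {"usability": 3, "extensibility": 4, "documentation": 2},
    "SystemB": {"usability": 4, "extensibility": 3, "documentation": 4},
}

def weighted_score(scores, weights):
    """Weighted sum of criterion ratings (weights reflect expert importance)."""
    return sum(weights[c] * scores[c] for c in weights)

# The system with the highest weighted score is the one to recommend.
best = max(system_scores, key=lambda name: weighted_score(system_scores[name], criteria_weights))
```

A weighted sum is only one possible aggregation; the article's concrete scoring scheme may differ.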
The question of why individuals adopt information technology has been present in information systems research for the past quarter century. One of the most widely used models for predicting technology usage was introduced by Fred Davis: the Technology Acceptance Model (TAM). It describes the influence of perceived usefulness and perceived ease of use on attitude, behavioral intention, and system usage. The first two factors are in turn influenced by external variables. Although a plethora of papers on the TAM exists, an extensive analysis of the role of the external variables in the model is still missing. This paper aims to give an overview of the most important variables. In an extensive literature review, we identified 763 relevant papers, found 552 unique external variables, characterized the most important of them, and described the frequency of their appearance. Additionally, we grouped these variables into four categories (organizational characteristics, system characteristics, personal user characteristics, and other variables). Finally, we discuss the results and show implications for theory and practice.
With largely identical equipment and new demands being placed on university computing centers, cooperation is increasingly moving into focus. For such cooperation, including across different types of higher education institutions, the classic informal framework is often no longer sufficient. Several prerequisites must be met for successful collaboration. Computing centers are taking on a new role as providers of services for users outside their own institution as well. Likewise, they may increasingly find themselves on the user side in the future. IT service units must become aware of their new role as both providers of services and consumers of third-party services, and factor this into their considerations when designing new services.
Marketing events engage audiences emotionally in a distinctive way. Changing attitudes and improving image are therefore the central objectives of event marketing. Building on the essential fundamentals of event marketing, this chapter focuses on controlling in event marketing. It addresses performance measurement and impact research in event marketing, explains the conditions under which an image transfer from an event to a brand or a company takes place, and presents methods for measuring image.
Instead of waiting for and constantly adapting to the details of political interventions, utilities need to view their environment from a holistic perspective. The unique position of the company, be it a local utility, a bigger player, or an international utility specializing in specific segments, has to be the basis of its goals and strategies. But without consistent translation of these goals and strategies into processes, structures, and company culture, a strategy remains pure theory. Companies need to engage in a continuing learning process. This means being willing to pass on strategies, to slow down or speed up, to work from a different angle, and so on.
Reduced research and development (R&D) efficiency, strong competition from generics, increased cost pressure from payers, and the increased biological complexity of new target indications have resulted in a rethinking and a shift in the pharmaceutical industry from a traditional, more closed R&D model toward the new paradigm of open innovation. In recent years, pharmaceutical companies have broadened their external networks toward research collaborations with academic institutes, technology providers, or codevelopment partners. To meet the demand to reduce timelines and costs, research-based pharmaceutical companies started to outsource R&D activities. In addition, internal R&D processes were adjusted to the more open R&D model, and new processes such as alliance management were established. The corporate frontier of pharmaceutical companies became permeable and more open. As a result, the focus of pharmaceutical R&D expanded from a purely internal toward a mixed internal and external model. Today, the U.S. pharmaceutical company Eli Lilly may have established the most open model toward external innovation, as it has integrated its innovation processes with its business model. Other companies are following this more open R&D model with newer concepts such as new frontier sciences, drug discovery alliances, public-private partnerships, innovation incubators, virtual R&D, crowdsourcing, open source innovation, and innovation camps.
Clinical development is historically the phase in which a potential new medicine is tested in phase 2 and phase 3 patient trials to demonstrate the new molecule's efficacy and safety, supporting the regulatory approval of drugs by health authorities. This relatively focused approach has been considerably expanded by a number of forces from within the pharmaceutical industry and, equally important, by changes in healthcare systems. Today, an integral part of this phase includes the need to identify the optimal patient population, showstoppers leading to the discontinuation of clinical programs, the silent but constant removal of surrogate endpoints for registration, and the increased demand for real-life data, which are used to demonstrate patient benefit and play an ever-increasing role in pricing and reimbursement negotiations.
This chapter reviews not only the nuts and bolts of clinical development but also recent developments that shape the environment, how the different players have reacted, and what options might need to be explored in the future.
Virtual prototyping of integrated mixed-signal smart sensor systems requires high-performance co-simulation of analog frontend circuitry with complex digital controller hardware and embedded real-time software. We use SystemC/TLM 2.0 in conjunction with a cycle-count accurate temporal decoupling approach (TD) to simulate digital components and firmware code execution at high speed while preserving clock-cycle accuracy and, thus, real-time behavior at time quantum boundaries. Optimal time quanta ensuring real-time capability can be calculated and set automatically during simulation if the simulation engine has access to exact timing information about upcoming inter-process communication events. These methods fail in the case of non-deterministic, asynchronous events, resulting in potentially invalid simulation results. In this paper, we propose an extension to the case of asynchronous events generated by blackbox sources from which a priori event timing information is not available, such as coupled analog simulators or hardware in the loop. Additional event processing latency or rollback effort caused by temporal decoupling is minimized by calculating optimal time quanta dynamically in a SystemC model using a linear prediction scheme. We analyze the theoretical performance of the presented predictive temporal decoupling approach (PTD) by deriving a cost model that expresses the expected simulation effort in terms of key parameters such as time quantum size and CPU time per simulation cycle. For an exemplary smart-sensor system model, we show that quasi-periodic events that trigger activities in TD processes are handled accurately after the predictor has settled.
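The linear prediction of optimal time quanta can be illustrated with a minimal sketch. This is an assumption for illustration only, not the paper's actual predictor: it fits a least-squares line to the timestamps of past quasi-periodic events and extrapolates one step, so the next decoupled time quantum can end at the predicted event:

```python
# Hypothetical linear-prediction step: fit event index -> timestamp with
# least squares and extrapolate to estimate when the next asynchronous
# (blackbox) event will arrive, then size the time quantum accordingly.

def predict_next_event(times):
    """Least-squares linear fit over observed event timestamps; returns
    the extrapolated timestamp of the next event."""
    n = len(times)
    xs = range(n)
    mx = sum(xs) / n
    my = sum(times) / n
    slope = sum((x - mx) * (t - my) for x, t in zip(xs, times)) / sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    return slope * n + intercept  # predicted timestamp of event number n

# Quasi-periodic events at roughly 10 time-unit intervals with jitter:
observed = [10.0, 20.1, 29.9, 40.0]
next_t = predict_next_event(observed)
quantum = next_t - observed[-1]  # end the decoupled quantum at the predicted event
```

Once the predictor has settled on a stable period, quantum boundaries align with the actual events, which is what keeps event handling accurate without rollback in the quasi-periodic case.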
The Internet of Things (IoT) refers to the interconnectedness of physical objects, achieved by equipping them with sensors and actuators as a means to connect to the internet. The number of connected things has increased threefold over the past five years. Consequently, firms expect the IoT to become a source of new technology-driven business models. However, only a few early adopters have started to install and use IoT appliances on a frequent basis, so it is still unclear which factors drive the technological acceptance of IoT appliances. Confronting this gap in current research, the present paper explores how IoT appliances are conceptually defined, which factors drive their technological acceptance, and how firms can use the results to improve value propositions in corresponding business models. It is found that IoT appliance vendors need to maintain a broad focus because potential buyers are highly diverse. Drawing conclusions from this insight, the paper illustrates several flexible marketing strategies.