Year of publication
- 2017 (96)
Document Type
- Conference proceeding (48)
- Journal article (25)
- Book (14)
- Working Paper (5)
- Doctoral Thesis (2)
- Book chapter (1)
- Issue of a journal (1)
Institute
- Informatik (39)
- ESB Business School (32)
- Texoversum (10)
- Technik (8)
- Life Sciences (6)
Publisher
- Hochschule Reutlingen (21)
- Gesellschaft für Informatik e.V. (15)
- Universität Tübingen (8)
- Elsevier (5)
- Technische Universität Berlin (3)
- Association for Information Systems (2)
- Deutsche Gesellschaft für Computer- und Roboterassistierte Chirurgie e. V. (2)
- IOP Publishing (2)
- Koordinierungsstelle (2)
- MIM, Marken-Institut München (2)
New digital technologies present both game-changing opportunities for—and existential threats to—companies whose success was built in the pre-digital economy. This article describes our findings from a study of 25 companies that were embarking on digital transformation journeys. We identified two digital strategies—customer engagement and digitized solutions—that provide direction for a digital transformation. Two technology-enabled assets are essential for executing those strategies: an operational backbone and a digital services platform. We describe how a big old company can combine these elements to navigate its digital transformation.
The best fully automated analysis process achieves even better classification results than the established manual process. The best algorithms for the three analysis steps are (i) SGLTR (Savitzky-Golay Laplace operator filter thresholding regions) and LM (Local Maxima) for automated peak identification, (ii) EM clustering (Expectation Maximization) and DBSCAN (Density-Based Spatial Clustering of Applications with Noise) for the clustering step, and (iii) RF (Random Forest) for multivariate classification. Thus, automated methods can replace the manual steps in the analysis process and enable unbiased, high-throughput use of the technology.
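The three-step pipeline named above can be sketched in a few lines of Python. This is a minimal illustration on synthetic data, not the paper's implementation: SGLTR is specific to the paper, so plain local-maxima picking with `scipy.signal.find_peaks` stands in for step (i), and DBSCAN stands in for the clustering step.

```python
# Sketch of the three-step pipeline: (i) peak identification,
# (ii) clustering of peak positions, (iii) Random Forest classification.
# Synthetic data throughout; parameter values are illustrative.
import numpy as np
from scipy.signal import find_peaks
from sklearn.cluster import DBSCAN
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# (i) Peak identification: local maxima above a noise threshold.
x = np.linspace(0, 10, 500)
spectrum = np.exp(-(x - 3) ** 2 / 0.05) + np.exp(-(x - 7) ** 2 / 0.05)
spectrum += rng.normal(0, 0.01, x.size)
peaks, _ = find_peaks(spectrum, height=0.5, distance=50)

# (ii) Clustering: group detected peak positions (DBSCAN stand-in for EM).
positions = x[peaks].reshape(-1, 1)
labels = DBSCAN(eps=0.5, min_samples=1).fit_predict(positions)

# (iii) Classification: Random Forest on a stand-in feature matrix
# (in the real pipeline, features would be per-cluster peak intensities).
X = rng.normal(size=(40, 2))
y = (X[:, 0] > 0).astype(int)
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
```

The point of the sketch is the hand-off between steps: detected peak positions become the clustering input, and cluster-level features become the classifier input.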
Data integration of heterogeneous data sources relies either on periodically transferring large amounts of data to a physical Data Warehouse or on retrieving data from the sources only on request. The latter results in what is referred to as a virtual Data Warehouse, which is preferable when using the latest data is paramount. The downside, however, is added network traffic and degraded performance when data volumes are high. In this paper, we propose a readCheck validator to ensure the timeliness of the queried data and to reduce data traffic. We further show that the readCheck allows transactions to update data in the data sources while obeying full Atomicity, Consistency, Isolation, and Durability (ACID) properties.
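The readCheck idea can be illustrated with a small sketch. The class and method names below are assumptions for illustration, not the paper's actual interface: the mediator compares a cheap version token against the source before any data transfer, so an unchanged source is never re-fetched, yet a stale cache is always detected.

```python
# Illustrative sketch of a readCheck-style freshness validator for a
# virtual Data Warehouse. Names (Source, VirtualWarehouse, readCheck
# comment) are hypothetical, not the paper's interface.
class Source:
    def __init__(self, rows):
        self.rows = list(rows)
        self.version = 0          # bumped on every committed update

    def update(self, rows):       # a transaction committing on the source
        self.rows = list(rows)
        self.version += 1

class VirtualWarehouse:
    def __init__(self, source):
        self.source = source
        self._cache = None
        self._cached_version = -1

    def query(self):
        # readCheck: transfer only the version token, not the data,
        # when the cached copy is still current.
        if self.source.version != self._cached_version:
            self._cache = list(self.source.rows)      # full transfer
            self._cached_version = self.source.version
        return self._cache

src = Source([1, 2, 3])
vdw = VirtualWarehouse(src)
first = vdw.query()       # first call: full transfer
second = vdw.query()      # validated by the token check, no data transfer
src.update([1, 2, 3, 4])
third = vdw.query()       # source changed, cache invalidated, re-fetch
```

Only the comparison of a small token travels over the network on the second call, which is the traffic reduction the abstract refers to.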
The Ninth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2017), held May 21-25, 2017, in Barcelona, Spain, continued a series of international events covering a broad spectrum of topics related to advances in database fundamentals, the evolving relationship between databases and other domains, database technologies and content processing, as well as the specifics of databases in application domains.
Advances in technologies and domains related to databases have triggered substantial improvements in content processing, information indexing, and data, process, and knowledge mining. The push came from Web services, artificial intelligence, and agent technologies, as well as from the widespread adoption of XML.
High-speed communications and computations, large storage capacities, and load balancing for distributed database access allow new approaches to content processing with incomplete patterns, advanced ranking algorithms, and advanced indexing methods.
Developments in e-business, e-health and telemedicine, bioinformatics, finance and marketing, and geographical positioning systems put pressure on database communities to extend the de facto methods to support new requirements in terms of scalability, privacy, performance, indexing, and heterogeneity of both content and technology.
Despite a growing consensus that intuition should be embraced as a valuable complement to the rationally oriented decision-making culture in sales, uncertainty and a lack of knowledge about the proper handling of intuition still appear to prevail among employees. Systematically legitimizing intuition in sales can counteract this.
At major sporting events such as football World Cups and European Championships or the Olympic Games, millions are at stake for federations and official sponsors, and they defend their advertising rights accordingly fiercely. Burger King shows how this "monopoly" can be circumvented creatively. The following article presents two examples of Burger King's ambush-marketing activities during the 2016 UEFA European Football Championship. Non-sponsor Burger King used ambush marketing deliberately and creatively during the tournament to score points against the official UEFA sponsor and competitor McDonald's.
B-to-B sales has been transformed by the latest information technologies and has become considerably more complex. At the same time, the opportunities have improved dramatically. Internationally operating, virtually collaborating teams have shifted decision-making processes and competencies. Social selling now offers a way to better master the newly emerging challenges in sales.
The modern industrial corporation encompasses a myriad of different software applications, each of which must work in concert to deliver functionality to end-users. However, the increasingly complex and dynamic nature of competition in today’s product-markets dictates that this software portfolio be continually evolved and adapted, in order to meet new business challenges. This ability – to rapidly update, improve, remove, replace, and reimagine the software applications that underpin a firm’s competitive position – is at the heart of what has been called IT agility. Unfortunately, little work has examined the antecedents of IT agility, with respect to the choices a firm makes when designing its “Software Portfolio Architecture.”
We address this gap in the literature by exploring the relationship between software portfolio architecture and IT agility at the level of the individual applications in the architecture. In particular, we draw from modular systems theory to develop a series of hypotheses about how different types of coupling impact the ability to update, remove or replace the software applications in a firm’s portfolio. We test our hypotheses using longitudinal data from a large financial services firm, comprising over 1,000 applications and over 3,000 dependencies between them. Our methods allow us to disentangle the effects of different types and levels of coupling.
Our analysis reveals that applications with higher levels of coupling cost more to update, are harder to remove, and are harder to replace than those with lower coupling. The measures of coupling that best explain differences in IT agility include all indirect dependencies between software applications (i.e., they include coupling and dependency relationships that are not easily visible to the system architect). Our results reveal the critical importance of software portfolio design decisions in developing a portfolio of applications that can evolve and adapt over time.
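The role of indirect dependencies can be made concrete with a small sketch (the application names and graph below are hypothetical): the coupling a replacement project must face is the transitive closure of the direct-dependency graph, not just the edges visible to the architect.

```python
# Direct vs. indirect coupling in a software portfolio: computing the
# transitive closure of a dependency graph. Application names are
# illustrative, not from the study.
from collections import defaultdict

direct = {                       # app -> set of apps it directly depends on
    "billing": {"customer-db"},
    "customer-db": {"auth"},
    "auth": set(),
    "reporting": {"billing"},
}

def transitive_closure(deps):
    """Return, for each app, every app reachable via dependency edges."""
    closure = defaultdict(set)

    def reach(app, seen):
        for dep in deps.get(app, ()):
            if dep not in seen:
                seen.add(dep)
                reach(dep, seen)
        return seen

    for app in deps:
        closure[app] = reach(app, set())
    return dict(closure)

indirect = transitive_closure(direct)
# "reporting" directly touches only "billing", but replacing it means
# facing every application reachable beneath it.
```

The gap between `direct["reporting"]` (one edge) and `indirect["reporting"]` (the whole chain) is exactly the hidden coupling the abstract says best explains differences in IT agility.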
We present a topology of MIMO arrays of inductive antennas with inherently high crosstalk-cancellation capabilities. A single-layer PCB is etched into a 3-channel array of emitting/receiving antennas. Once coupled with a similar 3-channel emitter/receiver, we measured an Adjacent Channel Rejection Ratio (ACRR) as high as 70 dB from 150 Hz to 150 kHz. Another primitive device, made of copper wires wound around PVC tubes to form a 2-channel "non-contact slip ring", exhibited 22 dB to 47 dB of ACRR up to 15 MHz. In this paper we introduce the theoretical model underlying the crosstalk-suppression capabilities of these so-called "Pie-Chart antennas": an extension of the mutual-inductance compensation method to a higher number of channels using symmetries. We detail the simple iterative process of building these antennas, illustrate it with numerical analysis, and evaluate their effectiveness in real experiments on the 3-channel PCB array and the 2-channel rotary array, up to the limits of our test setup. The Pie-Chart design is primarily, but not exclusively, intended as an alternative to costly electronic filters or cumbersome EM shields in both wireless and wired applications.