Lithographic hotspot (LH) detection using deep learning (DL) has received much attention in recent years, mainly because DL approaches achieve better accuracy than traditional, state-of-the-art programmed approaches. The purpose of this study is to compare existing data augmentation (DA) techniques for integrated circuit (IC) mask data using DL methods. DA refers to the process of creating new samples similar to the training set, thereby helping to reduce the imbalance between classes and improving the performance of the DL system. Experimental results suggest that the DA methods increase the overall performance of DL models for hotspot detection tasks.
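Geometric augmentation of layout clips can be sketched as follows. This is an illustrative example only — the clip contents, the chosen transforms, and the function name are assumptions, not the study's actual pipeline; flips and 90-degree rotations are a common label-preserving choice for mask data:

```python
import numpy as np

def augment_mask_clip(clip: np.ndarray) -> list[np.ndarray]:
    """Generate label-preserving variants of a square layout clip.

    Flips and 90-degree rotations are assumed here to preserve the
    hotspot label, a common (approximate) assumption for mask data.
    """
    variants = []
    for k in range(4):                       # 0, 90, 180, 270 degrees
        rotated = np.rot90(clip, k)
        variants.append(rotated)
        variants.append(np.fliplr(rotated))  # mirrored counterpart
    return variants

clip = np.arange(16).reshape(4, 4)  # stand-in for a binary mask clip
augmented = augment_mask_clip(clip)
print(len(augmented))  # 8 variants per clip
```

Each training clip thus yields eight samples at no labeling cost, which is one way DA narrows the gap between the rare hotspot class and the abundant non-hotspot class.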
Background. The application of lean management is standard in many companies all over the world. It is used to continuously optimise existing production processes and to reduce the complexity of administrative processes. Unfortunately, in higher education, the awareness of lean management as a highly effective methodology is quite low.
Research aims. The research aim is to show how the lean strategy can be applied in university environments. Finally, this paper addresses the question of why it is so difficult to implement lean in a university environment and how an institution of higher education can move forward towards becoming a lean university.
Methodology. Based on a literature review, five key lean principles are presented and examples of their implementation are discussed using short case studies from our own institution. We also compare our findings with those in the literature.
Key findings. Lean offers the chance to improve the management of higher education institutions. This requires a commitment from the university's top management to convincing all stakeholders that a culture of lean helps the institution adapt to the rapidly changing environment of higher education.
A transaction is a demarcated sequence of application operations, for which the following properties are guaranteed by the underlying transaction processing system (TPS): atomicity, consistency, isolation, and durability (ACID). Transactions are therefore a general abstraction, provided by the TPS, that simplifies application development by relieving transactional applications of the burden of concurrency and failure handling. Apart from the ACID properties, a TPS must guarantee high and robust performance (high transactional throughput and low response times), high reliability (no data loss, ability to recover the last consistent state, fault tolerance), and high availability (infrequent outages, short recovery times).
The architectures and workhorse algorithms of a high-performance TPS are built around the properties of the underlying hardware. The introduction of non-volatile memories (NVM) as a novel storage technology opens an entirely new problem space, with the need to revise aspects such as the virtual memory hierarchy, storage management and data placement, access paths, and indexing. NVM are also referred to as storage-class memory (SCM).
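The atomicity property can be illustrated with the stdlib `sqlite3` module — a generic example, not the TPS discussed above. The connection context manager commits on success and rolls back on failure, so a transfer either applies both updates or neither:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE account (id INTEGER PRIMARY KEY, balance INTEGER)")
con.execute("INSERT INTO account VALUES (1, 100), (2, 100)")
con.commit()

# A transfer is one transaction: either both updates survive or neither does.
try:
    with con:  # opens a transaction; commits on success, rolls back on error
        con.execute("UPDATE account SET balance = balance - 50 WHERE id = 1")
        # Simulated crash before the matching credit is applied:
        raise RuntimeError("crash mid-transaction")
except RuntimeError:
    pass

balances = [row[0] for row in con.execute("SELECT balance FROM account ORDER BY id")]
print(balances)  # [100, 100] -- the partial debit was rolled back
```

The application never sees the half-finished state; concurrency and failure handling are delegated to the TPS, exactly as described above.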
Active storage
(2018)
In brief, Active Storage refers to an architectural hardware and software paradigm based on collocating storage and compute units. Ideally, it allows executing application-defined data ... within the physical data storage. Thus, Active Storage seeks to minimize expensive data movement, improving performance, scalability, and resource efficiency. The effective use of Active Storage mandates new architectures, algorithms, interfaces, and development toolchains.
Blockchains give rise to new workloads in database management systems and K/V-stores. Distributed Ledger Technology (DLT) is a technique for managing transactions in 'trustless' distributed systems. Yet, clients of nodes in blockchain networks are backed by 'trustworthy' K/V-stores, like LevelDB or RocksDB in Ethereum, which are based on Log-Structured Merge Trees (LSM-Trees). However, LSM-Trees do not fully match the properties of blockchains and enterprise workloads.
In this paper, we claim that Partitioned B-Trees (PBT) fit the properties of DLT: uniformly distributed hash keys, immutability, consensus, invalid blocks, unspent and off-chain transactions, reorganization, and data state / version ordering in a distributed log-structure. PBTs can locate records of newly inserted key-value pairs, as well as data of unspent transactions, in separate partitions in main memory. Once several blocks acquire consensus, PBTs evict a whole partition, which becomes immutable, to secondary storage. This behavior minimizes write amplification and enables a beneficial sequential write pattern on modern hardware. Furthermore, DLT implies some form of log-based versioning. PBTs can serve as an MV-store for the data storage of logical blocks and for indexing in multi-version concurrency control (MVCC) transaction processing.
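The partition lifecycle described above can be sketched as a toy model — this is an illustration of the idea (mutable in-memory partition, frozen immutable runs, newest-version-wins lookup), not the authors' PBT implementation:

```python
class PartitionedStore:
    """Toy model of a partitioned B-tree style store (illustration only):
    new key-value pairs go to a mutable in-memory partition; once a batch
    of blocks reaches consensus, the partition is frozen and evicted to
    secondary storage as an immutable, sorted run written sequentially."""

    def __init__(self):
        self.active = {}   # mutable in-memory partition
        self.evicted = []  # immutable runs, newest first

    def put(self, key, value):
        self.active[key] = value

    def evict_partition(self):
        # Freeze: sort once, then the run can be written sequentially,
        # which is the write pattern modern hardware favors.
        self.evicted.insert(0, sorted(self.active.items()))
        self.active = {}

    def get(self, key):
        if key in self.active:
            return self.active[key]
        for run in self.evicted:  # newest version shadows older ones
            for k, v in run:
                if k == key:
                    return v
        return None

store = PartitionedStore()
store.put("tx1", "unspent")
store.put("tx2", "spent")
store.evict_partition()    # the enclosing blocks reached consensus
store.put("tx1", "spent")  # a newer version shadows the evicted one
print(store.get("tx1"))    # spent
```

Because evicted runs are never modified, the model avoids in-place updates entirely; versioning falls out of the partition order, mirroring the log-based versioning of DLT.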
Modern persistent Key/Value stores are designed to meet the demand for high transactional throughput and high data ingestion rates. Still, they rely on a backwards-compatible storage stack and abstractions to ease space management and foster seamless proliferation and system integration. Their dependence on the traditional I/O stack has a negative impact on performance, causes unacceptably high write-amplification, and limits storage longevity.
In the present paper we present NoFTL-KV, an approach that results in a lean I/O stack, integrating physical storage management natively into the Key/Value store. NoFTL-KV eliminates backwards compatibility, allowing the Key/Value store to directly leverage the characteristics of modern storage technologies. NoFTL-KV is implemented under RocksDB. The performance evaluation under LinkBench shows that NoFTL-KV improves transactional throughput by 33%, while response times improve by up to 2.3x. Furthermore, NoFTL-KV reduces write-amplification 19x and improves storage longevity by approximately the same factor.
Background: Internationally, teledermatology has proven to be a viable alternative to conventional physical referrals. Travel costs and referral times are reduced while patient safety is preserved. Patients from rural areas in particular benefit from this healthcare innovation. Despite these established facts and positive experiences from neighboring EU countries like the Netherlands or the United Kingdom, Germany has not yet implemented store-and-forward teledermatology in routine care.
Methods: The TeleDerm study will implement and evaluate store-and-forward teledermatology in 50 general practitioner (GP) practices as an alternative to conventional referrals. TeleDerm aims to confirm that offering store-and-forward teledermatology in GP practices will lead to a 15% (n = 260) reduction in referrals in the intervention arm. The study uses a cluster-randomized controlled trial design. Randomization is planned at the cluster level "county". The main observational unit is the GP practice. A Poisson distribution of referrals is assumed. The evaluation of secondary outcomes like acceptance, enablers, and barriers uses a mixed-methods design with questionnaires and interviews.
Discussion: Due to the heterogeneity of GP practice organization, patient management software, information technology service providers, and GPs' personal technical affinity and training, we expect several challenges in implementing teledermatology in German GP routine care. Therefore, we plan to recruit 30% more GPs than required by the power calculation. The implementation design and accompanying evaluation are expected to deliver vital insights into the specifics of implementing telemedicine in German routine care.
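The Poisson referral assumption and the hypothesized 15% effect can be illustrated with a small stdlib-only simulation. All numbers here are assumptions for illustration (a baseline of ~35 referrals per practice, 50 practices per arm), not the study's actual power calculation:

```python
import math
import random

def poisson_sample(lam: float, rng: random.Random) -> int:
    """Draw one Poisson(lam) variate via Knuth's algorithm (stdlib only)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

# Illustrative numbers only: ~35 referrals per practice at baseline,
# 50 practices per arm, and the hypothesized 15% reduction.
rng = random.Random(42)
BASELINE, PRACTICES, REDUCTION, RUNS = 35.0, 50, 0.15, 500

wins = 0
for _ in range(RUNS):
    control = sum(poisson_sample(BASELINE, rng) for _ in range(PRACTICES))
    treated = sum(poisson_sample(BASELINE * (1 - REDUCTION), rng)
                  for _ in range(PRACTICES))
    wins += treated < control
print(wins / RUNS)  # fraction of runs where the intervention arm refers less
```

With these assumed rates the arm-level difference dwarfs the Poisson noise, which is why the study can size its effect at the cluster level; the real calculation additionally has to account for clustering within counties and practices.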
We present an approach for segmenting individual cells and lamellipodia in epithelial cell clusters using fully convolutional neural networks. The method lays the groundwork for measuring cell cluster dynamics and expansion, improving the investigation of collective cell migration phenomena. The fully learning-based front-end avoids classical feature engineering, yet the network architecture needs to be designed carefully. Our network predicts how likely each pixel is to belong to each of the classes and is thus able to segment the image. Besides characterizing segmentation performance, we discuss how the network will be further employed.
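The per-pixel prediction step can be sketched as follows: a fully convolutional network emits one score map per class, and the segmentation is the per-pixel argmax of the softmax over those maps. This is a generic numpy sketch with random logits standing in for real network outputs; the class names are assumptions:

```python
import numpy as np

def segment(logits: np.ndarray) -> np.ndarray:
    """Turn per-class score maps (C, H, W) into a label map (H, W).

    A softmax over the class axis gives per-pixel class probabilities;
    the predicted segment is the most likely class at each pixel.
    """
    shifted = logits - logits.max(axis=0, keepdims=True)  # numerical stability
    probs = np.exp(shifted)
    probs /= probs.sum(axis=0, keepdims=True)
    return probs.argmax(axis=0)

# Three assumed classes (background, cell body, lamellipodium) on a 4x4 image.
rng = np.random.default_rng(0)
logits = rng.normal(size=(3, 4, 4))
labels = segment(logits)
print(labels.shape)  # (4, 4)
```

Since softmax is monotonic per pixel, the label map equals the argmax of the raw scores; the probabilities matter for training (cross-entropy loss) and for confidence estimates, not for the final hard segmentation.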
This work is a report on practical experiences with the issue of interoperability in German practice management systems (PMSs) from an ongoing clinical trial on teledermatology, the TeleDerm project. A proprietary, established web platform for store-and-forward telemedicine is integrated with the IT in the GPs' offices for the automatic exchange of basic patient data. Most of the 19 different PMSs included in the study sample lack support for modern health data exchange standards; therefore, the relatively old but widely available German health data exchange interface "Gerätedatentransfer" (GDT) is used. Due to the lack of enforcement and regulation of the GDT standard, several obstacles to interoperability are encountered. As a partial but reusable working solution to cope with these issues, we present a custom middleware that is used in conjunction with GDT. We describe the design, the technical implementation, and the observed hindrances with the existing infrastructure. A discussion of healthcare interfacing standards and the current state of interoperability in German PMS software is given.
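The record structure such middleware has to handle can be sketched as follows. GDT records are line-based, each line starting with a 3-digit length and a 4-digit field identifier; the sketch below is simplified (the sample values are invented, and real GDT processing also needs character-set handling and record-type logic, which is part of why loose enforcement causes the obstacles described above):

```python
def parse_gdt(lines):
    """Parse simplified GDT records into (field_id, content) pairs.

    Each line: 3-digit length (of the whole line incl. CRLF),
    4-digit field identifier, then the field content.
    Malformed lines are skipped rather than aborting the import.
    """
    records = []
    for raw in lines:
        line = raw.rstrip("\r\n")
        if len(line) < 7:
            continue  # too short to hold length + field identifier
        field_id, content = line[3:7], line[7:]
        records.append((field_id, content))
    return records

# Invented sample lines in the length/field-id layout described above.
sample = ["01380006310\r\n", "014810000205\r\n", "0193101Mustermann\r\n"]
print(parse_gdt(sample))
```

Tolerant parsing of this kind is one pragmatic answer to vendors interpreting the standard differently: the middleware extracts what it can and flags the rest, instead of rejecting a whole transfer over one malformed line.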
Impact of phenolic resin preparation on its properties and its penetration behavior in Kraft paper
(2018)
The core of decorative laminates is generally made of stacked Kraft paper sheets impregnated with a phenolic resin. As the impregnation process in industry is relatively fast, new methods need to be developed to characterize it for different paper-resin systems. Several phenolic resins were synthesized with the same phenol:formaldehyde ratio of 1:1.8 and characterized by Fourier Transform Infrared Spectrometry (FTIR) as well as Size-Exclusion Chromatography (SEC). In addition, their viscosities and surface tensions when diluted in methanol to a solid content of 45% were measured. The capacity of each resin to penetrate a Kraft paper sheet was characterized using a new method, which measures the conductivities induced by the liquid resin crossing the paper substrate. With this method, crossing times could be measured with good accuracy. Surprisingly, the results showed that the penetration time of the resin samples is not correlated with the viscosity values, but rather with the surface tension characteristics and the chemical characteristics of the paper. Furthermore, some resins had a higher swelling effect on the fibers, which delayed the crossing of the liquid through the paper.