Dimka Karastoyanova is Associate Professor of Data Science and Business Intelligence at the Kühne Logistics University. Before joining KLU in September 2016, she was a Junior Professor in Simulation Workflows at the Institute of Architecture of Application Systems and in the Cluster of Excellence Simulation Technology (SimTech) at the University of Stuttgart. She received her doctoral degree in Computer Science from TU Darmstadt and holds M.Sc. degrees in Computational Engineering from FAU Erlangen-Nuremberg and in Industrial Engineering from the Technical University of Sofia, Bulgaria.
In addition to her research activities in Simulation Technology, she is an active member of the Service Oriented Architecture (SOA) and Business Process Management (BPM) communities. She is a member of the program committees of major conferences in these fields, such as BPM, ICSOC, and CoopIS, and serves as a reviewer for journals including IEEE Transactions on Services Computing (TSC), IEEE Transactions on Software Engineering (TSE), Information and Software Technology (IST), the International Journal of Cooperative Information Systems (IJCIS), the Journal of Systems and Software (JSS), Springer Computing, and others.
A central topic of her research is the development of novel approaches, models, architectures, and tools for building software systems, from both a theoretical and a practical point of view, for the application domains of business and eScience, while maintaining flexibility, interoperability, automation, and ease of use.
Karastoyanova’s main long-term research interests lie in data-driven process performance improvement and decision-making support for domain experts in business, with a focus on logistics and supply chain management, as well as in eScience, utilizing and advancing approaches from the broader areas of Data Science, Big Data, Data Analytics, Software Engineering, Cloud Computing, and the Internet of Things.
Weiß, Andreas and Dimka Karastoyanova (2016): Enabling coupled multi-scale, multi-field experiments through choreographies of data-driven scientific simulations, Computing, 98(4): 439-467.
Abstract: Current systems for enacting scientific experiments, and simulation workflows in particular, do not support multi-scale and multi-field problems if they are not coupled on the level of the mathematical model. To address this deficiency, we present an approach enabling the trial-and-error modeling and execution of multi-scale and/or multi-field simulations in a top-down and bottom-up manner which is based on the notion of choreographies. The approach defines techniques for composing data-intensive, scientific workflows in more complex simulations in a generic, domain-independent way and thus provides means for collaborative and integrated data management using the workflow/process-based paradigm. We contribute a life cycle definition of such simulations and present in detail concepts and techniques that support all life cycle phases. Furthermore, requirements on a respective software system and choreography language supporting multi-scale and/or multi-field simulations are identified, and an architecture and its realization are presented.
Strauch, Steve, Vasilios Andrikopoulos, Dimka Karastoyanova, Frank Leymann, Nikolay Nachev and Albrecht Stäbler (2014): Migrating enterprise applications to the cloud: methodology and evaluation, International Journal of Big Data Intelligence, 1(3): 127-140.
Abstract: Migrating existing on-premise applications to the cloud is a complex and multi-dimensional task and may require adapting the applications themselves significantly. For example, when considering the migration of the database layer of an application, which provides data persistence and manipulation capabilities, it is necessary to address aspects like differences in the granularity of interactions and data confidentiality, and to enable the interaction of the application with remote data sources. In this work, we present a methodology for application migration to the cloud that takes these aspects into account. In addition, we also introduce a tool for decision support, application refactoring and data migration that assists application developers in realising this methodology. We evaluate the proposed methodology and enabling tool using a case study in collaboration with an IT enterprise.
Sonntag, Mirko and Dimka Karastoyanova (2013): Model-as-you-go: An Approach for an Advanced Infrastructure for Scientific Workflows, Journal of Grid Computing, 11(3): 553-583.
Abstract: Most of the existing scientific workflow systems rely on proprietary concepts and workflow languages. We are convinced that the conventional workflow technology that has been established in business scenarios for years is also beneficial for scientists and scientific applications. We are therefore working on a scientific workflow system based on business workflow concepts and technologies. The system offers advanced flexibility features to scientists in order to support them in creating workflows in an explorative manner and to increase the robustness of scientific applications. We named the approach Model-as-you-go because it enables users to model and execute workflows in an iterative process that eventually results in a complete scientific workflow. In this paper, we present the main ingredients of Model-as-you-go, show how existing workflow concepts have to be extended in order to cover the requirements of scientists, discuss the application of the concepts to BPEL, and introduce the current prototype of the system.
Wetzstein, Branimir, Asli Zengin, Raman Kazhamiakin, Annapaola Marconi, Marco Pistore, Dimka Karastoyanova and Frank Leymann (2012): Preventing KPI Violations in Business Processes based on Decision Tree Learning and Proactive Runtime Adaptation, Journal of Systems Integration, 3(1): 3-18.
Abstract: The performance of business processes is measured and monitored in terms of Key Performance Indicators (KPIs). If the monitoring results show that the KPI targets are violated, the underlying reasons have to be identified and the process should be adapted accordingly to address the violations. In this paper we propose an integrated monitoring, prediction and adaptation approach for preventing KPI violations of business process instances. KPIs are monitored continuously while the process is executed. Additionally, based on KPI measurements of historical process instances we use decision tree learning to construct classification models which are then used to predict the KPI value of an instance while it is still running. If a KPI violation is predicted, we identify adaptation requirements and adaptation strategies in order to prevent the violation.
Sonntag, Mirko and Dimka Karastoyanova (2011): Compensation of Adapted Service Orchestration Logic in BPEL’n’Aspects, in: Rinderle-Ma, Stefanie, Farouk Toumani and Karsten Wolf (eds.): Proceedings of the 9th International Conference on Business Process Management (BPM 2011), 413-428.
Abstract: BPEL’n’Aspects is a non-intrusive mechanism for adaptation of the control flow of BPEL processes based on the AOP paradigm. It relies on Web service standards to weave process activities in terms of aspects into BPEL processes. This paper is a logical continuation of the BPEL’n’Aspects approach. Its main objective is to enable compensation of woven-in Web service invocations (activities) in a straightforward manner. We present (1) requirements on a mechanism for compensation of woven-in process activities; (2) the corresponding concepts and mechanisms to meet these requirements; (3) an example scenario to show the applicability of the approach; and (4) a prototypical implementation to prove the feasibility of the solution. This work represents an improvement in the applicability of this particular adaptation approach, since processes in production also need the means to compensate actions that are included into processes as a result of an adaptation step. The concept is generic and hence can also be used by other approaches that adapt control flow.
Teaching at KLU
|Since 09/2016||Associate Professor of Data Science and Business Intelligence at Kühne Logistics University, Hamburg|
|2008 - 2016||Junior Professor in Simulation Workflows at IAAS and the Cluster of Excellence Simulation Technology (SimTech), University of Stuttgart|
|2006 - 2008||Postdoc (Senior Research and Teaching Assistant) at the Institute of Architecture of Application Systems (IAAS), Computer Science Department, University of Stuttgart|
|2005 - 2006||Research and Teaching Assistant at the Institute of Architecture of Application Systems (IAAS), Computer Science Department, University of Stuttgart|
|2002 - 2004||Research Assistant at the Databases and Distributed Systems Group, Computer Science Department, and PhD student at the Graduate School “Enabling Technologies for the E‐Commerce”, Technische Universität Darmstadt|
Education
|2006||PhD in Computer Science, Technische Universität Darmstadt, Germany|
M.Sc. in Computational Engineering, Friedrich-Alexander-Universität Erlangen-Nürnberg, Germany
M.Sc. in Industrial Engineering, Technical University of Sofia, Bulgaria
B.Sc. in Industrial Engineering, Technical University of Sofia, Bulgaria