The KLU faculty, post-docs, and PhD candidates regularly publish the results of their research in scientific journals. You will find a complete overview of all KLU publications below (e.g. articles in peer-reviewed journals, professional journals, books, working papers, and conference proceedings). Search for relevant terms and keywords, or filter the list by name, year of publication or type of publication. The references include DOIs and abstracts where available, and you can download them to your own reference database or platform. We regularly update the database with new publications.

Journal Articles (Peer-Reviewed)

DOI: 10.1016/S0925-5273(00)00052-9

Abstract: To properly conduct aggregate control functions such as order acceptance and capacity loading, good estimates of the available production capacity need to be on hand. Capacity structures in batch process industries are generally so complex that it is not straightforward to estimate the capacity of a production department. In this paper, we assess the quality of estimation models that are based on regression. The paper builds on earlier results, which have demonstrated that a limited number of factors can explain a large share of the variance in makespan estimation based on regression models.

DOI: 10.1287/inte.31.2.56.10633

Abstract: This article focuses on the use of operations research (OR) in the Internet-enabled supply chains of business enterprises. It discusses the benefits of using OR in planning, customer-relationship management, product design, and marketing; the impact of Internet growth on opportunities for OR; and the improvement of firms' supply-chain management through OR.

DOI: 10.1007/BF03396633

Abstract: Almost all results on competitive product positioning derived in the literature so far are based on the hypothesis that static Nash equilibria of profit-maximizing competitors are accurate predictors of final market configurations. If the positioning behavior of firms differs from this assumption, it is questionable whether the corresponding propositions can be used in optimal product positioning. In this paper, we explore the validity of the Nash reaction hypothesis. We use a newly developed marketing simulation game, PRODSTRAT, to observe decisions of 240 advanced marketing students on product position, price, and marketing budget under various market conditions. We compare the players’ final configurations to Nash equilibria under the assumption that all players attempt to maximize their profit. Our results show that pricing and budgeting decisions are very well described by Nash equilibria for fixed product positions, but that decisions on product positioning are significantly more competitive. The experiments lead to less differentiated market configurations. The result is increased pricing as well as budgeting competition, and significantly reduced profits. We develop and support the hypothesis that the more aggressive product positioning behavior observed here stems from attempts to reduce profit differences (asymmetry) relative to competitors. Since profit asymmetry occurs in many market settings, it is an important factor to consider in making product positioning decisions in a competitive environment.

DOI: 10.1057/palgrave.jors.2601162

Abstract: Supply chain planning concepts from multi-echelon inventory theory are generally based on some form of centralised planning of supply chains. Those multi-echelon models that do consider decentralised planning assume complete information and/or a specific single objective function. This paper investigates how multi-echelon inventory theory can accommodate a setting with decentralised decision makers (a supplier and a number of retail groups) without complete information. We present a coordination procedure that does not require the retail groups to exchange demand information, but does allow using opportunities for demand pooling between them. We illustrate our ideas by way of a quantitative analysis of a two-echelon divergent supply chain, with both cooperative and non-cooperative retail groups. We conclude that coordination across a supply chain with decentralised control and limited centralised information is feasible by using available algorithms with satisfactory service level and cost performance.


Abstract: This article describes a method for determining optimal lot sizes when this decision is a long-term one. The method was developed following a logistics study at the Synthetic Plant of Solvay Pharmaceuticals, where active pharmaceutical ingredients are produced. Switching to a different lot size turned out to be a far-reaching decision there, with long-term consequences. The existing methods for determining optimal lot sizes (such as Camp's formula) were not suitable in this situation. The method described in this article will find its application mainly in the process industries, such as the pharmaceutical, chemical, steel, and glass industries, because there a change in lot size is often a long-term decision.
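
Camp's formula mentioned in the abstract is the classic economic order quantity (EOQ). A minimal sketch of the baseline the authors found unsuitable for long-term lot-sizing decisions, with purely illustrative demand and cost figures:

```python
import math

def camp_eoq(annual_demand: float, setup_cost: float, holding_cost: float) -> float:
    """Camp's formula (EOQ): the lot size that minimizes the sum of
    setup and inventory holding costs, assuming constant demand."""
    return math.sqrt(2 * annual_demand * setup_cost / holding_cost)

# Hypothetical figures: 10,000 units/year demand, 500 per setup,
# 4 per unit and year holding cost
q = camp_eoq(10_000, 500, 4)
print(round(q))  # 1581
```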

DOI: 10.1023/A:1011178931639

Abstract: This study uses the concept of shadow prices for measuring the impacts of climate change. By estimating a restricted profit function rather than a cost or a production function, the explanatory power of the model is increased because of an endogenous output structure. Using panel data on Western German farmers at a low level of aggregation, the results imply that the agricultural production process is significantly influenced by climate conditions. Simulation results using a 2×CO2 climate scenario show positive impacts for all regions in Germany. Interestingly, the spatial distribution of the gains indicates no advantage for those regions that currently suffer from insufficient temperatures. Finally, the importance of an endogenous output structure is confirmed by the finding that the desired product mix will drastically change.

DOI: 10.1007/s002910050003

Abstract: We investigate the problem of companies that want to cooperate either by combining their salesforces or by operating a joint salesforce. Companies may have salesforces of different sizes that also differ in their effectiveness. They need an instrument to evaluate how much they gain from a cooperation, and a mechanism to allocate the profit in a fair way. In the typical case of a lack of response data, we suggest inferring additional sales from response functions for which the Dorfman-Steiner theorem holds in the optimum. Searching for an appropriate allocation mechanism becomes difficult because typical cooperative solutions, where each company pays its own salesforce but benefits from increased sales, or where commission rates on sales are paid to a joint subsidiary, may lead to asymmetric distributions of profit contribution across companies. We suggest that companies follow the Nash solution for cooperative games, which recommends that each company receive in advance the profit it would achieve in the case of non-cooperation, and that the remaining profit be shared equally.
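
The allocation rule in the last sentence can be sketched directly; the figures below are hypothetical and the paper's response-function machinery for estimating the joint profit is not modeled:

```python
def nash_split(standalone_profits: list[float], joint_profit: float) -> list[float]:
    """Nash solution for the cooperative game as described above:
    each firm first receives its non-cooperation profit, and the
    remaining surplus from cooperating is shared equally."""
    surplus = joint_profit - sum(standalone_profits)
    share = surplus / len(standalone_profits)
    return [p + share for p in standalone_profits]

# Hypothetical: two firms earning 100 and 60 alone, 200 jointly
print(nash_split([100, 60], 200))  # [120.0, 80.0]
```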

DOI: 10.1108/13598540010319993

Abstract: Increased demand variability in supply chains (the bullwhip effect) has been discussed in the literature. The practical measurement of this effect, however, entails some problems that have received little attention in the literature and that concern the aggregation of data, the incompleteness of data, and the isolation of demand data for defined supply chains that are part of a greater supply web. This paper discusses these conceptual measurement problems, reports experiences in dealing with some of them in an industrial project, and presents empirical results of measurements of the bullwhip effect in two supply chains.
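
A common operationalization of the bullwhip effect (one of several in the literature, not necessarily the exact measure used in this paper) is the ratio of the variance of orders placed upstream to the variance of demand observed downstream; a minimal sketch with made-up weekly series:

```python
from statistics import pvariance

def bullwhip_ratio(orders: list[float], demand: list[float]) -> float:
    """Variance amplification from demand to orders; a value
    greater than 1 indicates the bullwhip effect."""
    return pvariance(orders) / pvariance(demand)

# Hypothetical weekly series: orders swing more widely than demand
demand = [100, 102, 98, 101, 99, 100]
orders = [100, 110, 85, 112, 90, 103]
print(bullwhip_ratio(orders, demand) > 1)  # True
```

The data-aggregation problems the abstract mentions show up here concretely: computing the ratio on, say, monthly totals instead of weekly observations can smooth away much of the variance amplification.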

DOI: 10.1016/S0965-8564(98)00066-4

Abstract: The Netherlands Railways operates a double-tracked, intensively used network of railroads. To expand the transportation capacity, each year a number of infrastructure expansions are considered. The evaluation of these expansions is traditionally done by establishing a set of detailed timetables that serve the forecasted transportation demand and that can be executed with the proposed infrastructure expansion. However, the development of a detailed timetable is a very time-consuming process and therefore leaves little opportunity for comparing many alternatives. In this paper, we present and test an aggregate model that can be used to single out the most promising investment alternatives in the railroad infrastructure, specifically passing constructions. The aggregate model provides the user with insight into the ranking of the various alternatives and additionally gives a relative insight into the theoretical capacity of the proposed infrastructure change.

DOI: 10.1023/A:1007672800764

Abstract: Aggregate models of detailed scheduling problems are needed to support aggregate decision making such as customer order acceptance. In this paper, we explore the performance of various aggregate models in a decentralized control setting in batch chemical manufacturing (no-wait job shops). Using simulation experiments based on data extracted from an industry application, we conclude that a linear regression based model outperforms a workload based model with regard to capacity utilization and the need for replanning at the decentralized level, specifically in situations with increased capacity utilization and/or a high variety in the job mix.


Abstract: The selling process in business-to-business transactions is characterized by substantial resources that are invested to gather inquiries, engineer projects, and organize the cooperation of subcontractors. In this manuscript we show that the common business practice of evaluating each project separately very often leads to suboptimal decisions. A literature review also reveals that current approaches offer little help in differentiating between more and less profitable and promising projects. We develop and calibrate a model that helps to determine the optimal budget levels that should be spent on gathering inquiries and on winning the contract. This is modelled by means of a two-stage approach: the expected profit of a potential project is the product of the probability of gathering an inquiry and the expected profit of the inquiry, minus the budget for gathering the inquiry (1st stage). The expected profit of the inquiry, in turn, is the product of the probability of finally winning the contract and the expected gross margin, minus the budget for winning the contract (2nd stage). In a realistic business case, we show that one need not necessarily run complex statistical analyses to arrive at nearly optimal solutions. Rather, the structure of the optimal solution can be simplified to a heuristic that provides near-optimal values.
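
The two-stage structure described above can be written out directly. The figures are hypothetical, and the response functions linking budgets to the two probabilities, which are the paper's actual contribution, are not modeled here:

```python
def expected_project_profit(p_inquiry: float, p_win: float,
                            gross_margin: float,
                            b_inquiry: float, b_win: float) -> float:
    """Two-stage expected profit as described in the abstract:
    stage 2: expected profit of an inquiry = P(win) * margin - win budget;
    stage 1: expected project profit = P(inquiry) * stage-2 profit - inquiry budget.
    """
    inquiry_profit = p_win * gross_margin - b_win
    return p_inquiry * inquiry_profit - b_inquiry

# Hypothetical project: 60% inquiry chance, 30% win chance,
# 500,000 gross margin, 10,000 inquiry budget, 40,000 win budget
print(expected_project_profit(0.6, 0.3, 500_000, 10_000, 40_000))  # 56000.0
```

Optimizing then means searching over (b_inquiry, b_win) with the probabilities responding to the budgets, which is where the paper's heuristic comes in.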

DOI: 10.1007/BF03372627

Abstract: In markets with more or less interchangeable products, the design of buyer-seller relationships may become a critical factor in differentiating the firm from competitors. In this paper, we focus on how to optimally allocate scarce resources across customers. We show that there is a need to segment customers in order to optimally allocate marketing and personal selling resources (e.g., budgeting, call planning). We show that an optimal allocation of resources has to consider a customer's size (sales, marginal profit contribution) and responsiveness to marketing measures (elasticity). Against this background, we discuss the segmentation approaches in detail. It is shown that portfolio and scoring approaches are most helpful in segmenting customers in accordance with their economic value. However, these approaches lead to near-optimal solutions only if they first identify criteria which influence size and responsiveness and then optimally weight these criteria. Optimal weights are derived by determining the elasticity of size or responsiveness with respect to these criteria. These weights must be multiplied by the relative changes of the scores of the influencing criteria, summed for size and responsiveness separately, and then finally multiplied by each other.
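
The weighting rule in the last two sentences can be sketched as follows; the elasticities and score changes below are purely illustrative numbers, not values from the paper:

```python
def dimension_score(elasticities: list[float], relative_changes: list[float]) -> float:
    """Weighted sum for one dimension (size or responsiveness):
    each criterion's relative score change is weighted by the
    elasticity of the dimension with respect to that criterion."""
    return sum(e * c for e, c in zip(elasticities, relative_changes))

def segment_value(size_elast, size_changes, resp_elast, resp_changes) -> float:
    """Size and responsiveness scores are computed separately and
    then multiplied, as the abstract describes."""
    return (dimension_score(size_elast, size_changes)
            * dimension_score(resp_elast, resp_changes))

# Hypothetical customer with two criteria per dimension
value = segment_value([0.8, 0.2], [0.10, 0.05],
                      [0.5, 0.5], [0.20, 0.00])
print(round(value, 4))  # 0.009
```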

DOI: 10.1023/A:1008999002145

Abstract: We investigate the performance of workload rules used to support customer order acceptance decisions in the hierarchical production control structure of a batch chemical plant. Customer order acceptance decisions need to be made at a point in time when no detailed information is available about the actual shop floor status during execution of the order. These decisions therefore need to be based on aggregate models of the shop floor, which predict the feasibility of completing the customer order in time. In practice, workload rules are commonly used to estimate the availability of sufficient capacity to complete a set of orders in a given planning period. Actual observations in a batch chemical manufacturing plant show that the set of accepted orders needs to be reconsidered later, because the schedule turns out to be infeasible. Analysis of the planning processes used at the plant shows that workload rules can yield reliable results, but at the expense of a rather low capacity utilization, which is often unacceptable in practice. Since solving a detailed scheduling problem is not feasible at this stage, this creates a dilemma that can only be solved by finding aggregate models that are more detailed than workload rules.

DOI: 10.1016/S0925-5273(99)00111-5

Abstract: We study multipurpose batch process industries with no-wait restrictions, overlapping processing steps, and parallel resources. To achieve high utilization and reliable lead times, the master planner needs to be able to accurately and quickly estimate the makespan of a job set. Because constructing a schedule is time consuming, and production plans may change frequently, estimates must be based on aggregate characteristics of the job set. To estimate the makespan of a complex set of jobs, we introduce the concept of job interaction. Using statistical analysis, we show that a limited number of characteristics of the job set and the available resources can explain most of the variability in the job interaction.

DOI: 10.1111/j.1937-5956.2000.tb00326.x

Abstract: In this article, we describe the Global Project Coordination Course, a course in which project teams composed of three students from each of two overseas universities execute company-sponsored projects dealing with global supply chain management issues. The 75,000 to 100,000 contributed in total by the three to four sponsoring companies funds all course expenses. We assess the benefits and challenges of the use of cross-cultural project teams with diverse educational backgrounds. We conclude that the course provides a unique and effective vehicle for furthering students' knowledge of Supply Chain Management and Information Systems, improving understanding of "soft" issues, and training students to work in diverse, global, cross-cultural project teams.

DOI: 10.1007/BF03371618

Abstract: The German market for Digital Television has not yet reached the targeted number of adopters. Although the diffusion of DF1 and Premiere Digital has been marginal so far, there has been significant public discussion about the danger of monopolistic structures, with the main focus on the KirchGroup as one of the pioneers and the main content provider. In contrast, we show that this discussion is rather short-term oriented. The existing Digital Television is merely an important technological step towards Interactive Television, which provides a much broader variety of services and will involve a different set of players. As such, the arguments against the KirchGroup are not relevant in the long run. Our main hypothesis is based on the need to provide more content as Digital Television turns interactive. Therefore, more content providers will gain access to the system; otherwise, the Internet will become the dominant alternative for users.

DOI: 10.1023/A:1008130918565

Abstract: Based on an unbalanced panel of all Bavarian cooperative banks for the years 1989–97, which includes information on 283 mergers, we analyze motives and cost effects of small-scale mergers in German banking. Estimating a frontier cost function with a time-variable stochastic efficiency term, we show that positive scale and scope effects from a merger arise only if the merged unit closes part of the former branch network. When we compare actual mergers to a simulation of hypothetical mergers, the size effects of observed mergers turn out to be slightly more favorable than those of all possible mergers. Banks taken over by others are less efficient than the average bank in the same size class, but exhibit, on average, the same efficiency as the acquiring firms. For the post-merger phase, our empirical results provide no evidence of efficiency gains from merging, but point instead to a leveling off of differences among the merging units.
