The KLU faculty, post-docs, and PhD candidates regularly publish the results of their research in scientific journals. You will find a complete overview of all KLU publications below (e.g. articles in peer-reviewed journals, professional journals, books, working papers, and conference proceedings). Search for relevant terms and keywords, or filter the list by name, year of publication or type of publication. The references include DOIs and abstracts where available, and you can download them to your own reference database or platform. We regularly update the database with new publications.
Journal Articles (Peer-Reviewed)
Holweg, Matthias and John Bicheno (2002): Supply chain simulation - a tool for education, enhancement and endeavour, International Journal of Production Economics, 78 (2): 163-175.
Holweg, Matthias and Joe Miemczyk (2002): Logistics in the “three-day car” age: Assessing the responsiveness of vehicle distribution logistics in the UK, International Journal of Physical Distribution & Logistics Management, 32 (10): 829-850.
Bertrand, J.Will M. and Jan C. Fransoo (2002): Operations management research methodologies using quantitative modeling, International Journal of Operations and Production Management, 22 (2): 241-264.
Abstract: This paper gives an overview of quantitative model-based research in operations management, focusing on research methodology. It distinguishes between empirical and axiomatic research and, furthermore, between descriptive and normative research, and presents guidelines for doing quantitative model-based research in operations management. In constructing its arguments, the paper builds on lessons from operations research and operations management research over the past decades and on research from a selected number of other academic disciplines. It concludes that the methodology of quantitative model-driven empirical research offers a great opportunity for operations management researchers to further advance theory.
Flapper, Simme D.P., Jan C. Fransoo, Rob A.C.M. Broekmeulen and Karl Inderfurth (2002): Planning and control of rework in the process industries: a review, Production Planning & Control: The Management of Operations, 13 (1): 26-34.
Abstract: For all kinds of reasons, rework, i.e. the transformation of products not fulfilling preset specifications into products that do, is an important issue in the process industries. Despite a considerable amount of published research on the planning and control of rework, and many papers referring to the existence of rework in the process industries, hardly any attention has been paid to the consequences of the specific characteristics of many process industries for the execution, planning and control of rework operations. We identify the characteristics of the process industries that influence the possibilities for rework in these industries, based on a framework provided by Flapper and Jensen (International Journal of Production Research, 1999, to be published). We then assess how the available operations management literature assists in determining operational strategies for the planning and control of rework in the process industries, and conclude that many relevant and interesting problems have not yet been addressed.
Ivanescu, V.Cristina, Jan C. Fransoo and J.Will M. Bertrand (2002): Makespan estimation and order acceptance in batch process industries when processing times are uncertain, OR Spektrum, 24 (4): 467-495.
Abstract: Batch process industries are characterized by complex precedence relationships between operations, which makes estimating an acceptable workload very difficult. A detailed schedule-based model can be used for this purpose, but for large problems this may require a prohibitively large amount of computation time. We propose a regression-based model to estimate the makespan of a set of jobs. We extend earlier work based on deterministic processing times by considering Erlang-distributed processing times in our model. This regression-based model is used to support customer order acceptance. Three order acceptance policies are compared by means of simulation experiments: a scheduling policy, a workload policy and a regression policy. The results indicate that the performance of the regression policy can compete with that of the scheduling policy in situations with high variety in the job mix and high uncertainty in the processing times.
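As a hedged illustration of the regression-based estimation idea described in the abstract above (not the paper's actual model): a minimal one-variable least-squares fit that predicts the makespan of a job set from a single aggregate characteristic. The feature choice and all numbers are hypothetical.

```python
# Illustrative sketch only: a one-variable least-squares fit of the kind a
# regression-based makespan estimator might use. The feature (total processing
# time) and the data are hypothetical, not taken from the paper.

def fit_ols(xs, ys):
    """Ordinary least squares for y ~ a + b*x; returns (a, b)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Hypothetical job sets: total processing time (h) vs. realized makespan (h).
work = [40.0, 55.0, 70.0, 90.0]
makespan = [52.0, 65.0, 81.0, 100.0]

a, b = fit_ols(work, makespan)
estimate = a + b * 60.0  # predicted makespan for a new job set with 60 h of work
```

In the papers, several aggregate job-set and resource characteristics serve as regressors; the single-feature fit here is only meant to show the mechanics.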
Schnabl, Gunther and Dirk G. Baur (2002): Purchasing power parity: Granger causality tests for the yen–dollar exchange rate, Japan and the World Economy, 14 (4): 425-444.
Abstract: The paper analyses the causality between the Japanese prices and the yen–dollar exchange rate. It explains the long-term appreciation trend of the Japanese yen and why the Japanese yen proved strong even during the economic slump of the 1990s. The paper suggests that the appreciation of the Japanese yen forced the Japanese enterprises into price reductions and productivity increases, which put a floor under the high level of the yen and, thus, initiated rounds of appreciation. This corresponds to the conjecture of a vicious (virtuous) circle of appreciation and price adaptation. Further, there is evidence that the yen-appreciation has been accommodated by the Bank of Japan’s monetary policy. This corresponds to the conjecture that the recent Japanese deflation is imposed from outside via the exchange rate.
Czarny, Elżbieta and Günter Lang (2002): Poland’s Accession to the EU: What do we learn from Trade Theory?, Bank i Kredyt, 33 (2): 20-30.
Albers, Sönke and Bernd Skiera (2002): Einsatzplanung eines Verkaufsaußendienstes auf der Basis einer Umsatzreaktionsfunktion, Zeitschrift für Betriebswirtschaft, 72 (11): 1105-1132.
Lang, Günter (2001): Empirical Evidence of the Assimilation Hypothesis for Eastern European Immigrants, Zeszyty Naukowe Kolegium Gospodarki Światowej, 11: 118-135.
Fransoo, Jan C., Marc J.F. Wouters and Ton G. de Kok (2001): Multi-echelon multi-company inventory planning with limited information exchange, Journal of the Operational Research Society, 52 (7): 830-838.
Abstract: Supply chain planning concepts from multi-echelon inventory theory are generally based on some form of centralised planning of supply chains. Those multi-echelon models that do consider decentralised planning assume complete information and/or a specific single objective function. This paper investigates how multi-echelon inventory theory can accommodate a setting with decentralised decision makers (a supplier and a number of retail groups) without complete information. We present a coordination procedure that does not require the retail groups to exchange demand information, but does allow using opportunities for demand pooling between them. We illustrate our ideas by way of a quantitative analysis of a two-echelon divergent supply chain, with both cooperative and non-cooperative retail groups. We conclude that coordination across a supply chain with decentralised control and limited centralised information is feasible by using available algorithms with satisfactory service level and cost performance.
Raaymakers, Wenny H.M., J.Will M. Bertrand and Jan C. Fransoo (2001): Makespan estimation in batch process industries using aggregate resource and job set characteristics, International Journal of Production Economics, 70 (2): 145-161.
Abstract: To properly conduct aggregate control functions such as order acceptance and capacity loading, good estimates of the available production capacity need to be on hand. Capacity structures in batch process industries are generally so complex that it is not straightforward to estimate the capacity of a production department. In this paper, we assess the quality of estimation models that are based on regression. The paper builds on earlier results, which have demonstrated that a limited number of factors can explain a large share of the variance in makespan estimation based on regression models.
Rooij, B. de, Jan C. Fransoo and P.J.A. Verdaasdonk (2001): Economische seriegroottebepaling als lange termijn beslissing, Bedrijfskunde: Tijdschrift voor Modern Management, 73 (1): 84-91.
Abstract: This article describes a method for determining optimal batch sizes when this decision is a long-term one. The method was developed following a logistics study at the synthesis plant (Synthetische Fabriek) of Solvay Pharmaceuticals, where active pharmaceutical ingredients are produced. Switching to a different batch size turned out to be a far-reaching decision there, with long-term consequences. The existing methods for determining optimal batch sizes (such as Camp's formula) were not suitable in this situation. The method described in this article will find its main application in the process industries, such as the pharmaceutical, chemical, steel, and glass industries, where a change in batch size is often a long-term decision.
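The "Camp's formula" mentioned in the abstract above is the classic economic order quantity (EOQ). A minimal sketch, with purely hypothetical cost and demand figures:

```python
import math

def camp_eoq(demand_rate, setup_cost, holding_cost):
    """Classic economic order quantity (Camp's formula): Q* = sqrt(2*D*S / H),
    where D is the demand rate, S the fixed setup (ordering) cost per batch,
    and H the holding cost per unit per period."""
    return math.sqrt(2 * demand_rate * setup_cost / holding_cost)

# Hypothetical figures: 1200 units/year demand, 50 per setup, 3 per unit-year held.
q_star = camp_eoq(1200, 50, 3)  # = 200.0 units per batch
```

The article's point is precisely that this static trade-off is inadequate when a batch-size change is a long-term, hard-to-reverse decision.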
Marks, Ulf G. and Sönke Albers (2001): Experiments in competitive product positioning: Actual behavior compared to Nash solutions, Schmalenbach Business Review, 53 (3): 150-174.
Abstract: Almost all results on competitive product positioning derived in the literature so far are based on the hypothesis that static Nash equilibria of profit-maximizing competitors are accurate predictors of final market configurations. If the positioning behavior of firms differs from this assumption, it is questionable whether the corresponding propositions can be used in optimal product positioning. In this paper, we explore the validity of the Nash reaction hypothesis. We use a newly developed marketing simulation game, PRODSTRAT, to observe decisions of 240 advanced marketing students on product position, price, and marketing budget under various market conditions. We compare the players’ final configurations to Nash equilibria under the assumption that all players attempt to maximize their profit. Our results show that pricing and budgeting decisions are very well described by Nash equilibria for fixed product positions, but that decisions on product positioning are significantly more competitive. The experiments lead to less differentiated market configurations. The result is increased pricing as well as budgeting competition, and significantly reduced profits. We develop and support the hypothesis that the more aggressive product positioning behavior observed here stems from attempts to reduce profit differences (asymmetry) relative to competitors. Since profit asymmetry occurs in many market settings, it is an important factor to consider in making product positioning decisions in a competitive environment.
Holweg, Matthias and Frits K. Pil (2001): Successful build-to-order strategies start with the customer, MIT Sloan Management Review, 43 (1): 74-83.
Sodhi, ManMohan S. (2001): Applications and Opportunities for Operations Research in Internet-Enabled Supply Chains and Electronic Marketplaces, Interfaces, 31 (2): 56-69.
Abstract: Focuses on the use of operations research (OR) in the Internet-enabled supply chains of business enterprises: the benefits of using OR in planning, customer-relationship management, product design, and marketing; the impact of Internet growth on opportunities for OR; and the improvement of firms' supply-chain management through OR.
Bicheno, John, Matthias Holweg and Jens Niessmann (2001): Constraint batch sizing in a lean environment, International Journal of Production Economics, 73 (1): 41-49.
Lang, Günter (2001): Global Warming and German Agriculture: Impact Estimations Using a Restricted Profit Function, Environmental and Resource Economics, 19 (2): 97-112.
Abstract: This study uses the concept of shadow prices for measuring the impacts of climate change. By estimating a restricted profit function rather than a cost or a production function, the explanatory power of the model is increased because of an endogenous output structure. Using panel data on Western German farmers at a low level of aggregation, the results imply that the agricultural production process is significantly influenced by climate conditions. Simulation results using a doubled-CO2 (2×CO2) climate scenario show positive impacts for all regions in Germany. Interestingly, the spatial distribution of the gains indicates no advantage for those regions which currently suffer from insufficient temperatures. Finally, the importance of an endogenous output structure is confirmed by the finding that the desired product mix will change drastically.
Albers, Sönke (2000): Optimal allocation of profit across companies operating with a joint salesforce: Optimale Allokation von Deckungsbeiträgen bei Unternehmen mit einem gemeinsamen Vertrieb, OR-Spektrum, 22 (1): 19-33.
Abstract: We investigate the problem of companies that want to cooperate either by combining their salesforces or by operating a joint salesforce. Companies may have salesforces of different sizes that also differ in their effectiveness. They need an instrument to evaluate how much they gain from a cooperation, and a mechanism to allocate the profit in a fair way. In the typical case of a lack of response data, we suggest inferring additional sales based on response functions for which the Dorfman-Steiner theorem holds in the optimum. Searching for an appropriate allocation mechanism becomes difficult because typical cooperative solutions, where each company pays its own salesforce but benefits from increased sales, or where commission rates on sales are paid to a joint subsidiary, may lead to asymmetric distributions of profit contribution across companies. We suggest that companies follow the Nash solution for cooperative games, which recommends that each company receive in advance the profit it would achieve in the case of non-cooperation, and that the remaining profit be shared equally.
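The allocation rule described in the abstract's last sentence can be sketched directly; the firm count and all profit figures below are hypothetical:

```python
def nash_allocation(standalone_profits, joint_profit):
    """Nash-solution split for a cooperative game, as described in the abstract:
    each company first receives its standalone (non-cooperation) profit, and the
    remaining surplus from cooperating is shared equally."""
    surplus = joint_profit - sum(standalone_profits)
    share = surplus / len(standalone_profits)
    return [p + share for p in standalone_profits]

# Hypothetical case: two firms earning 100 and 60 alone, 200 with a joint salesforce.
alloc = nash_allocation([100.0, 60.0], 200.0)  # -> [120.0, 80.0]
```

Note that the split depends only on standalone profits and the joint total, which is what makes it usable when detailed response data are lacking.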
Albers, Sönke and Manfred Krafft (2000): Regeln zur Bestimmung des fast-optimalen Angebotsaufwands, Zeitschrift für Betriebswirtschaft, 70 (10): 1083-1107.
Abstract: The selling process in business-to-business transactions is characterized by the substantial resources that are invested to gather inquiries, engineer projects, and organize the cooperation of subcontractors. In this manuscript we show that the common business practice of evaluating each project separately very often leads to suboptimal decisions. A literature review also reveals that current approaches offer little help in differentiating between more and less profitable and promising projects. We develop and calibrate a model that helps to determine the optimal budget levels that should be spent on gathering inquiries and on winning the contract. This is modelled by means of a two-stage approach: the expected profit of a potential project is the product of the probability of gathering an inquiry and the expected profit of the inquiry, minus the budget for gathering the inquiry (1st stage). The expected profit of the inquiry, in turn, is the product of the probability of finally winning the contract and the expected gross margin, minus the budget for winning the contract (2nd stage). In a realistic business case, we show that one need not necessarily run complex statistical analyses to arrive at nearly optimal solutions. Rather, the structure of the optimal solution can be simplified to a heuristic that provides near-optimal values.
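The two-stage profit calculation spelled out in the abstract can be written down almost verbatim. A minimal sketch under one reading of the formula (the inquiry budget is spent before it is known whether an inquiry materializes); all probabilities and budgets are hypothetical:

```python
def expected_project_profit(p_inquiry, b_inquiry, p_win, b_win, gross_margin):
    """Two-stage expected profit as described in the abstract.
    Stage 2: expected profit of an inquiry = p_win * gross_margin - b_win.
    Stage 1: expected project profit = p_inquiry * stage-2 profit - b_inquiry.
    (Assumes both budgets are sunk up front; other readings are possible.)"""
    inquiry_profit = p_win * gross_margin - b_win
    return p_inquiry * inquiry_profit - b_inquiry

# Hypothetical numbers: 60% chance of an inquiry at a cost of 5, then a
# 40% chance of winning the contract at a bid cost of 10, with margin 100.
profit = expected_project_profit(0.6, 5.0, 0.4, 10.0, 100.0)  # = 13.0
```

A project is worth pursuing under this model only if the stage-1 value is positive at the chosen budget levels.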
Raaymakers, Wenny H.M. and Jan C. Fransoo (2000): Identification of aggregate resource and job set characteristics for predicting job set makespan in batch process industries, International Journal of Production Economics, 68 (2): 137-149.
Abstract: We study multipurpose batch process industries with no-wait restrictions, overlapping processing steps, and parallel resources. To achieve high utilization and reliable lead times, the master planner needs to be able to accurately and quickly estimate the makespan of a job set. Because constructing a schedule is time consuming, and production plans may change frequently, estimates must be based on aggregate characteristics of the job set. To estimate the makespan of a complex set of jobs, we introduce the concept of job interaction. Using statistical analysis, we show that a limited number of characteristics of the job set and the available resources can explain most of the variability in the job interaction.
Raaymakers, Wenny H.M., J.Will M. Bertrand and Jan C. Fransoo (2000): The performance of workload rules for order acceptance in batch chemical manufacturing, Journal of Intelligent Manufacturing, 11 (2): 217-228.
Abstract: We investigate the performance of workload rules used to support customer order acceptance decisions in the hierarchical production control structure of a batch chemical plant. Customer order acceptance decisions need to be made at a point in time when no detailed information is available about the actual shop floor status during execution of the order. These decisions therefore need to be based on aggregate models of the shop floor, which predict the feasibility of completing the customer order in time. In practice, workload rules are commonly used to estimate whether sufficient capacity is available to complete a set of orders in a given planning period. Actual observations in a batch chemical manufacturing plant show that the set of accepted orders needs to be reconsidered later, because the schedule turns out to be infeasible. Analysis of the planning processes used at the plant shows that workload rules can yield reliable results, but only at the expense of a rather low capacity utilization, which is often unacceptable in practice. Since solving a detailed scheduling problem is not feasible at this stage, this creates a dilemma that can only be resolved by finding aggregate models that are more detailed than workload rules.
Raaymakers, Wenny H.M., J.Will M. Bertrand and Jan C. Fransoo (2000): Using aggregate estimation models for order acceptance in a decentralized production control structure for batch chemical manufacturing, IIE Transactions, 32 (10): 989-998.
Abstract: Aggregate models of detailed scheduling problems are needed to support aggregate decision making such as customer order acceptance. In this paper, we explore the performance of various aggregate models in a decentralized control setting in batch chemical manufacturing (no-wait job shops). Using simulation experiments based on data extracted from an industry application, we conclude that a linear regression based model outperforms a workload based model with regard to capacity utilization and the need for replanning at the decentralized level, specifically in situations with increased capacity utilization and/or a high variety in the job mix.
Krafft, Manfred and Sönke Albers (2000): Ansätze zur Segmentierung von Kunden - wie geeignet sind herkömmliche Konzepte?, Schmalenbachs Zeitschrift für betriebswirtschaftliche Forschung, 52 (6): 515-536.
Abstract: In markets with more or less interchangeable products, the design of buyer-seller relationships may become a critical factor in differentiating the firm from its competitors. In this paper, we focus on how to optimally allocate scarce resources across customers. We show that there is a need to segment customers in order to optimally design marketing and personal selling resources (e.g., budgeting, call planning). We show that an optimal allocation of resources has to consider a customer's size (sales, marginal profit contribution) and responsiveness to marketing measures (elasticity). Against this background, we discuss conventional segmentation approaches in detail. It is shown that portfolio and scoring approaches are most helpful in segmenting customers in accordance with their economic value. However, these approaches lead to near-optimal solutions only if they first identify criteria which influence size and responsiveness and then weight these criteria optimally. Optimal weights are derived by determining the elasticity of size or responsiveness with respect to these criteria. These weights must be multiplied by the relative changes of the scores of the influencing criteria, summed for size and responsiveness separately, and then finally multiplied by each other.
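The weighting scheme in the abstract's final sentences can be sketched as follows; the criteria names, elasticity weights, and score changes are invented purely for illustration:

```python
def customer_value_index(size_weights, resp_weights, rel_changes):
    """Scoring logic as described in the abstract: elasticity-derived weights
    are multiplied by the relative changes of the criterion scores, summed
    separately for the size dimension and the responsiveness dimension, and
    the two sums are then multiplied by each other."""
    size_effect = sum(w * rel_changes[c] for c, w in size_weights.items())
    resp_effect = sum(w * rel_changes[c] for c, w in resp_weights.items())
    return size_effect * resp_effect

# Hypothetical criteria and values for one customer.
size_weights = {"revenue": 0.7, "share_of_wallet": 0.3}
resp_weights = {"contact_affinity": 1.0}
rel_changes = {"revenue": 0.10, "share_of_wallet": 0.20, "contact_affinity": 0.05}

index = customer_value_index(size_weights, resp_weights, rel_changes)
```

Multiplying the two dimension sums (rather than adding them) reflects the abstract's point that allocation value is driven jointly by size and responsiveness.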