Revista Iberoamericana de Automática e Informática Industrial RIAI
Vol. 6. Issue 1.
Pages 94-104 (January 2009)
Open Access
Selección de variables en la predicción de llamadas en un centro de atención telefónica
Manuel R. Arahal*, Manuel Berenguel**, Eduardo F. Camacho*, Fernando Pavón*
* Dpto. de Ingeniería de Sistemas y Automática, Universidad de Sevilla. Camino de los Descubrimientos, s/n. 41092, Spain
** Dpto. de Lenguajes y Computación, Universidad de Almería. La Cañada de San Urbano, s/n. 04120, Spain
Under a Creative Commons license
Abstract

This article illustrates the importance of independent-variable selection for neural models aimed at predicting demand in a call center. The purpose of the models is to support the weekly planning of the center's staff, a task carried out 14 days in advance.
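
As an illustration of the prediction setting, the sketch below shows one plausible way to cast the task as supervised learning: a series of daily call counts is turned into a 14-day-ahead target together with lagged values and calendar variables as candidate inputs. The data layout and the names used (the calls series, the build_dataset function) are assumptions for illustration, not taken from the article.

    # Minimal sketch (assumed data layout): supervised data set for predicting
    # daily call volume 14 days ahead from its own history plus calendar inputs.
    import pandas as pd

    def build_dataset(calls: pd.Series, horizon: int = 14, n_lags: int = 21) -> pd.DataFrame:
        """calls: daily call counts indexed by a pandas DatetimeIndex."""
        df = pd.DataFrame({"y": calls.shift(-horizon)})   # target: calls `horizon` days ahead
        for lag in range(n_lags):                         # candidate lagged inputs
            df[f"calls_t-{lag}"] = calls.shift(lag)
        df["day_of_week"] = calls.index.dayofweek         # candidate calendar inputs
        df["month"] = calls.index.month
        return df.dropna()                                # keep only rows with a full window

    # Example with synthetic data (weekday volumes higher than weekend volumes):
    dates = pd.date_range("2007-01-01", periods=400, freq="D")
    calls = pd.Series(1000 + 100 * (dates.dayofweek < 5), index=dates)
    data = build_dataset(calls)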

The required models can draw on a large number of independent variables. However, the number of cases that can be used to estimate the model parameters is scarce because of socio-economic changes. This makes it necessary to select the independent variables carefully and to use as few of them as possible; otherwise the generalization of the model would degrade.

To address the problem, a mixed method is used that works with a large number of candidate variables in a first phase and selects a smaller number of variables more carefully in a second phase. The models obtained by applying the proposed method and its variants are evaluated on real data from a call center. The comparison shows that correct selection of independent variables is vital for this type of application.
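
The abstract does not spell out the algorithm, so the following is only a hedged sketch of a generic two-phase (mixed) selection scheme in that spirit: a cheap filter first ranks a large set of candidate variables, and a wrapper then searches a small subset using a neural model. The concrete choices here (mutual-information scores, a small scikit-learn MLP, greedy forward search) are assumptions for illustration, not necessarily the authors' method.

    # Hedged sketch of a two-phase (filter + wrapper) variable selection.
    import numpy as np
    from sklearn.feature_selection import mutual_info_regression
    from sklearn.model_selection import cross_val_score
    from sklearn.neural_network import MLPRegressor

    def mixed_selection(X, y, n_filter=10, n_final=4):
        """X: (n_samples, n_candidates) array of candidate inputs; y: target vector."""
        # Phase 1 (filter): rank every candidate by mutual information with the target.
        scores = mutual_info_regression(X, y)
        shortlist = np.argsort(scores)[::-1][:n_filter]

        # Phase 2 (wrapper): greedy forward selection over the shortlist, scoring
        # each candidate subset by the cross-validated error of a small network.
        selected = []
        while len(selected) < n_final:
            best_var, best_score = None, -np.inf
            for var in shortlist:
                if var in selected:
                    continue
                cols = selected + [var]
                net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
                score = cross_val_score(net, X[:, cols], y, cv=3,
                                        scoring="neg_mean_squared_error").mean()
                if score > best_score:
                    best_var, best_score = var, score
            selected.append(best_var)
        return selected

The filter phase keeps the wrapper search affordable when the candidate set is large, while the wrapper phase judges subsets by the criterion that ultimately matters: the validation error of the predictive model itself.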

Keywords:
Models
Prediction
Artificial neural networks
Full text is only available in PDF.
Copyright © 2009 Elsevier España, S.L. All rights reserved.