Course guide for Métodos para la Investigación en Dirección de Empresas (MA9/56/6/42)

Academic year 2024/2025
Date of approval by the Academic Commission: 19/07/2024

Master's programme

Máster Doble: Máster Universitario en Ingeniería de Caminos, Canales y Puertos + Máster Universitario en Economía / Economics

Module

Courses of the Máster Economía/Economics

Branch

Engineering and Architecture

Centre where teaching is delivered

E.T.S. de Ingeniería de Caminos, Canales y Puertos

Centre responsible for the degree

International School for Postgraduate Studies

Semester

First

Credits

4

Type

Compulsory

Type of teaching

Face-to-face

Teaching staff

  • Encarnación Álvarez Verdejo
  • Juan Francisco Muñoz Rosas
  • Matilde Ruiz Arroyo

Office hours (tutorías)

Encarnación Álvarez Verdejo

  • First semester
    • Tuesday, 8:00 to 14:00 (Empre. Desp. C107)
    • Tuesday, 8:00 to 14:00 (Empre. Desp. C108)
  • Second semester
    • Tuesday, 8:00 to 14:00 (Empre. Desp. C107)
    • Tuesday, 8:00 to 14:00 (Empre. Desp. C108)

Juan Francisco Muñoz Rosas

  • First semester
    • Monday, 10:00 to 13:00 (Empre. Desp. C106)
  • Second semester
    • Wednesday, 9:30 to 10:30 (Empre. Desp. C106)
    • Wednesday, 12:30 to 14:30 (Empre. Desp. C106)
    • Thursday, 9:30 to 10:30 (Empre. Desp. C106)
    • Thursday, 12:30 to 14:30 (Empre. Desp. C106)

Matilde Ruiz Arroyo

  • First semester
    • Monday, 15:30 to 17:30 (Empresariales Desp. A218)
    • Tuesday, 12:30 to 14:30 (Empresariales Desp. A218)
    • Tuesday, 17:30 to 19:30 (Empresariales Desp. A218)
    • Wednesday, 10:30 to 14:30 (Empresariales Desp. A218)
  • Second semester
    • Tuesday, 12:30 to 14:30 (Empresariales Desp. A218)
    • Thursday, 10:30 to 14:30 (Empresariales Desp. A218)

Brief description of contents (according to the Master's verification report)

  • Design of surveys.
  • Samples and questionnaires in management research.
  • Scales selection and composition.
  • Assessment of measures.
  • Exploratory and confirmatory factor analysis.
  • Data analysis and model design: structural equation modeling (SEM).

Prerequisites and/or recommendations

Not applicable.

Competencies

Basic competencies

  • CB6. Possess and understand knowledge that provides a basis or opportunity for originality in the development and/or application of ideas, often in a research context.
  • CB7. Students should know how to apply the knowledge acquired, and their problem-solving abilities, in new or unfamiliar environments within broader (or multidisciplinary) contexts related to their field of study.
  • CB8. Students should be able to integrate knowledge and face the complexity of formulating judgements on the basis of information that, while incomplete or limited, includes reflections on the social and ethical responsibilities linked to the application of their knowledge and judgements.
  • CB9. Students should know how to communicate their conclusions, and the knowledge and underlying rationale that support them, to specialised and non-specialised audiences in a clear and unambiguous way.
  • CB10. Students should possess the learning skills needed to continue studying in a way that will be largely self-directed or autonomous.

Learning outcomes (objectives)

  • Understand the relevance of notational analysis as a research technique in economics and management, through the use of different data analysis methods.
  • Become familiar with some research lines within notational analysis in the management field.
  • Develop the empirical part of a research project based on quantitative analysis: designing questionnaires, assessing measures, evaluating measurement validity, assessing expert opinions, etc.
  • Appreciate the importance of survey design and validation, sampling techniques, and data collection processes and their practical implementation.

Programme of theoretical and practical contents

Theory

Chapter 1. Questionnaire design for different types of surveys

  1. Response process.
  2. Measuring attitudes.
  3. Testing questionnaires.
  4. Self-administered questionnaires.
  5. Survey error.
  6. Survey mode.
  7. Mixed mode surveys.

Chapter 2. Design of samples and related problems in management research

  1. Notation
  2. Some sampling designs in management research.
  3. Survey estimation strategy.
  4. The problem of missing data in management research.
    • Survey non-response: unit non-response and item non-response.
    • Consequences: bias and variance.
    • Non-response mechanisms: MCAR; MAR; Ignorable and Non-ignorable missingness.
    • Weighting.
    • Imputation: deterministic and stochastic imputation, imputation classes (see the illustrative sketch after this chapter outline).
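
As a quick illustration of the imputation topics listed above (not part of the official course materials), the following Python sketch contrasts deterministic class-mean imputation with a stochastic random hot-deck within imputation classes. The toy data, variable names and class definition are hypothetical; only numpy and pandas are assumed.

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(42)

    # Toy survey data (hypothetical): "firm_size" defines the imputation
    # classes and "sales" suffers from item non-response (NaN).
    df = pd.DataFrame({
        "firm_size": ["small", "small", "small", "large", "large", "large"],
        "sales":     [10.0, np.nan, 12.0, 95.0, np.nan, 110.0],
    })

    # Deterministic imputation: replace each missing value with the mean of
    # the respondents in the same imputation class (class-mean imputation).
    df["sales_mean_imp"] = df.groupby("firm_size")["sales"].transform(
        lambda s: s.fillna(s.mean())
    )

    # Stochastic imputation: random hot-deck, i.e. each missing value is
    # replaced by a value drawn at random from respondents of the same class.
    def random_hot_deck(s):
        donors = s.dropna().to_numpy()
        if donors.size == 0 or not s.isna().any():
            return s
        out = s.copy()
        out[s.isna()] = rng.choice(donors, size=int(s.isna().sum()), replace=True)
        return out

    df["sales_hot_deck"] = df.groupby("firm_size")["sales"].transform(random_hot_deck)
    print(df)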

Chapter 3. Measurement validation, PLS path modeling and model evaluation.

  1. Introduction.
  2. Exploratory factor analysis.
  3. Reflective and formative constructs.
  4. Confirmatory factor analysis.
  5. Evaluation of the measurement model (see the illustrative sketch after this chapter outline).
  6. Building structural models.
  7. Evaluation of structural model.
  8. Moderation and mediation.
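
The following Python sketch (an illustration only, not official course material) shows three common measurement-model checks related to the points above: Cronbach's alpha, composite reliability (CR) and average variance extracted (AVE), computed from simulated item responses and hypothetical standardized loadings. Commonly cited reference thresholds are alpha and CR of at least 0.7 and AVE of at least 0.5.

    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_respondents x k_items) data matrix."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars.sum() / total_var)

    def composite_reliability(loadings):
        """CR = (sum lambda)^2 / ((sum lambda)^2 + sum(1 - lambda^2))."""
        s = loadings.sum() ** 2
        return s / (s + (1 - loadings ** 2).sum())

    def ave(loadings):
        """Average variance extracted: mean of the squared standardized loadings."""
        return (loadings ** 2).mean()

    # Simulated responses: 3 reflective items driven by one latent factor.
    rng = np.random.default_rng(0)
    factor = rng.normal(size=200)
    items = np.column_stack(
        [factor + rng.normal(scale=0.6, size=200) for _ in range(3)]
    )

    lam = np.array([0.82, 0.79, 0.85])  # hypothetical standardized loadings
    print("alpha:", round(cronbach_alpha(items), 3))       # usual threshold >= 0.7
    print("CR:   ", round(composite_reliability(lam), 3))  # usual threshold >= 0.7
    print("AVE:  ", round(ave(lam), 3))                     # usual threshold >= 0.5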

Practice

The practical syllabus is integrated into the theory syllabus.

Bibliography

Basic bibliography

  • Cochran, W.G. (1977). Sampling Techniques. 3rd ed. New York: Wiley.
  • Field, A. (2013). Discovering Statistics Using IBM SPSS Statistics. 4th edition. SAGE Publications.
  • Hair, J.F., Hult, G.T.M., Ringle, C.M., & Sarstedt, M. (2014). A Primer on Partial Least Squares Structural Equation Modeling (PLS-SEM). SAGE Publications.
  • Hedayat, A.S., Sinha, B.K. (1991) Design and Inference in Finite Population Sampling. John Wiley and Sons.
  • Henseler, J., Dijkstra, T. K., Sarstedt, M., Ringle, C. M., Diamantopoulos, A., Straub, D. W., Ketchen, D. J., Hair, J. F., Hult, G. T. M., and Calantone, R. J. (2014). Common Beliefs and Reality about Partial Least Squares: Comments on Rönkkö & Evermann (2013), Organizational Research Methods, 17(2), 182-209.
  • Hu, M., Salvucci, S., & Lee, R. (2001). A Study of Imputation Algorithms. Working Paper No. 2001-17. Washington, DC: U.S. Department of Education, National Center for Education Statistics.
  • Kalton, G., Kasprzyk, D. (1986). The treatment of missing survey data. Survey Methodology, 1-16.
  • Liñán, F., & Chen, Y. (2009). Development and cross-cultural application of a specific instrument to measure entrepreneurial intentions. Entrepreneurship Theory and Practice, 33(3), 593-617.
  • Little, R.J.A., Rubin, D.B. (2002). Statistical analysis with missing data. 2nd edition. New York: John Wiley & Sons, Inc.
  • Särndal, C.E., Swensson, B., & Wretman, J.H. (1992). Model Assisted Survey Sampling. New York: Springer-Verlag.

Complementary bibliography

  • Aguinis, H., Edwards, J. R., & Bradley, K. J. (2016). Improving our understanding of moderation and mediation in strategic management research, Organizational Research Methods, 20(4), 665-685.
  • Baron, R. M., & Kenny, D. A. (1986). The moderator-mediator variable distinction in social psychological research: Conceptual, strategic, and statistical considerations. Journal of Personality and Social Psychology, 51(6), 1173-1182.
  • Babin, B. J., Lee, Y. K., Kim, E. J., & Griffin, M. (2005). Modeling consumer satisfaction and word-of-mouth: Restaurant patronage in Korea. Journal of Services Marketing, 19(3), 133-
  • Beatty, P. (1995). Understanding the Standardized/Non-Standardized Interviewing Controversy, Journal of Official Statistics, 11, 147-160.
  • Bello, A.L. (1993). Choosing among imputation techniques for incomplete multivariate data: a simulation study. Communications in Statistics, 22, 823-877.
  • Berger, Y.G., & Rao, J.N.K. (2006). Adjusted jackknife for imputation under unequal probability sampling without replacement. Journal of the Royal Statistical Society, Series B, 68, 531-547.
  • Brick, J.M., & Kalton, G. (1996). Handling missing data in survey research. Statistical Methods in Medical Research, 5, 215-238.
  • Chaudhuri, A., & Vos, J.W.E. (1988). Unified Theory and Strategies of Survey Sampling. Amsterdam: North-Holland.
  • Chen, J., & Shao, J. (2000). Nearest neighbor imputation for survey data. Journal of Official Statistics, 16, 113-131.
  • Chin, W.W. (1998). Issues and Opinion on Structural Equation Modeling, MIS Quarterly, Vol. 22(1), 7-16.
  • Cohen, J. (1988). Statistical power analysis for the behavioral sciences. Mahwah, NJ: Lawrence Erlbaum.
  • Cohen, M.P. (1996). A new approach to imputation. American Statistical Association Proceedings of the Section on Survey Research Methods, 293-298.
  • Costello, A.B., & Osborne, J.W. (2005). Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis. Practical Assessment, Research and Evaluation, 10(7), 1-9.
  • Couper, M. P. (2011). The future of modes of data collection. Public Opinion Quarterly, 75(5), 889-908. Available at http://poq.oxfordjournals.org/content/75/5/889.full.
  • De Leeuw, D. (2005). To mix or not to mix data collection modes in surveys. Journal of official statistics, 21(2), 233. Available at http://www.jos.nu/Articles/abstract.asp?article=212233
  • Hair, J.F., Black, W.C., Babin, B., & Anderson, R.E. (2010). Multivariate Data Analysis. 7th edition. Pearson.
  • Hansen, M.H., Hurwitz, W.N. (1943) On the theory of sampling from finite populations. Annals of Mathematical Statistics 14, 333-362.
  • Hartline, M. D., & Ferrell, O. C. (1996). The management of customer-contact service employees: An empirical investigation. Journal of Marketing, 60(4), 52-70.
  • Healy, M.J.R., & Westmacott, M. (1956). Missing values in experiments analysed on automatic computers. Applied Statistics, 5, 203-206.
  • Hunter, J. and DeMaio, T. (2003). Results & Recommendations from the Cognitive Pretesting of the 2003 Public School Questionnaire from the Schools and Staffing Survey (example on how a report can be written).
  • Kalton, G. (1983). Compensating for missing data. Ann Arbor: Institute for Social Research, University of Michigan.
  • Keeter, S., Kennedy, C., Dimock, M., Best, J., & Craighill, P. (2006). Gauging the impact of growing nonresponse on estimates from a national RDD telephone survey. Public Opinion Quarterly, 70(5), 759-779. Available at: https://poq.oxfordjournals.org/content/70/5/759.full
  • Kreuter, F., Presser, S., and Tourangeau, R. (2008). Social Desirability Bias in CATI, IVR, and Web Surveys: The Effects of Mode and Question Sensitivity, Public Opinion Quarterly, 72, 847-865.
  • Mukhopadhyay, P. (2000). Topics in Survey Sampling. Springer.
  • Murthy, M.N. (1967). Sampling theory and method. Calcutta: Statistical Publishing Society.
  • Oksenberg, L., Cannell, C., and Kalton, G. (1991). New Strategies for Pretesting Survey Questions, Journal of Official Statistics, 7, 349-365.
  • Rancourt, E., Lee, H., & Särndal, C.E. (1994). Bias correction for survey estimates from data with ratio imputed values for confounded nonresponse. Survey Methodology, 20, 137-147.
  • Rao, J.N.K. (1996). On variance estimation with imputed survey data (with discussion). Journal of the American Statistical Association, 91, 499-520.
  • Rao, J.N.K., & Shao, J. (1992). Jackknife variance estimation with survey data under hot-deck imputation. Biometrika, 79, 811-822.
  • Rubin, D.B. (1978). Multiple imputations in sample surveys: a phenomenological Bayesian approach to nonresponse. Proceedings of the Survey Research Methods Section, American Statistical Association, 20-34.
  • Rubin, D.B. (1987). Multiple Imputation for Nonresponse in Sample Surveys. New York: Wiley.
  • Rubin, D.B. (1996). Multiple imputation after 18+ years. Journal of the American Statistical Association, 91, 473-489.
  • Schnell, R. and F. Kreuter. (2005). Separating Interviewer and Sampling-Point Effects. Journal of Official Statistics, 21, 389-410.
  • Sedransk, J. (1985). The objective and practice of imputation. Proc. First Annual Res. Conf. Washington, D.C.: Bureau of the Census. 445-452.
  • Singh, S. (2003). Advanced Sampling Theory with Applications: How Michael 'Selected' Amy. Kluwer Academic Publishers, The Netherlands.
  • Tanur, J., & Tourangeau, R. Cognitive Aspects of Survey Methodology: Building a Bridge Between Disciplines. Washington, DC: National Academy Press.
  • Tourangeau, R. (1984). Cognitive sciences and survey methods. In T. Jabine, M. Straf, J. Tanur, & R. Tourangeau (Eds.), Cognitive Aspects of Survey Methodology: Building a Bridge Between Disciplines (pp. 73-100). Washington, DC: National Academy Press.
  • Tourangeau, R., Rasinski, K., Jobe, J., Smith, T.W., and Pratt, W.F. (1997). Sources of Error in a Survey on Sexual Behavior. Journal of Official Statistics, 12, 341-365.
  • Valliant, R., Dorfman, A.H., Royall, R.M. (2000) Finite population sampling and inference: A prediction approach. Wiley Series in Probability and Statistics, Survey Methodology Section. New York. John Wiley and Sons, Inc.
  • Wetzels, M., Odekerken-Schroder, G., & van Oppen, C. (2009). Using PLS path modeling for assessing hierarchical construct models: Guidelines and empirical illustration. MIS Quarterly, 33(1), 177-195.
  • Wolter, K.M. (2007) Introduction to Variance Estimation. Second Edition. Springer.

Teaching methodology

Assessment (assessment tools, assessment criteria, and percentage of the final grade)

Ordinary assessment (evaluación ordinaria)

Article 17 of the UGR Assessment Policy and Regulations establishes that the ordinary assessment system (evaluación ordinaria) will preferably be based on the continuous assessment of students, except for those who have been granted the right to a single final assessment (evaluación única final, which is an assessment method that only takes a final exam into account).

In the continuous assessment system, a variety of assessment tools will be used, based mostly on the ongoing evaluation of the following aspects (the weight of each item in the final mark is shown in parentheses):

  • Attendance and active participation in the sessions (15%)
    • Attendance is measured as the percentage of sessions attended out of the total number of sessions (NOTE: where applicable, attendance at seminars specifically related to the course is compulsory).
    • Active participation is measured through notable contributions, presentations and/or answers to quizzes during the sessions.
  • Development and resolution of exercises (individually or in teams) (25%)
  • Tasks and/or projects (individually or in teams) (10%)
  • Written exam (50%). The exam consists of 3 parts (one per chapter). A minimum mark of 4 (out of 10) in each part is required for the final exam mark (the weighted sum of the 3 parts) to be computed. The exam is passed if this weighted sum is equal to or above 5 (out of 10). An illustrative sketch of how these weights combine is given after this list.
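
As an illustration only (not a binding grading rule), the Python sketch below combines the weights above into a continuous-assessment mark. The guide does not specify the weights of the three exam parts, so equal weights are assumed here purely for the example.

    def continuous_assessment_grade(attendance, exercises, tasks, exam_parts,
                                    exam_part_weights=(1/3, 1/3, 1/3)):
        """Combine the continuous-assessment items (all marks on a 0-10 scale).

        Weights follow the guide: attendance/participation 15%, exercises 25%,
        tasks/projects 10%, written exam 50%. The three exam-part weights are
        an illustrative assumption; the guide only says "weighted sum".
        Returns (final_grade, exam_mark, exam_passed).
        """
        if any(part < 4 for part in exam_parts):
            # Below 4 (out of 10) in any part: the exam mark is not computed.
            return None, None, False
        exam = sum(w * p for w, p in zip(exam_part_weights, exam_parts))
        grade = 0.15 * attendance + 0.25 * exercises + 0.10 * tasks + 0.50 * exam
        return grade, exam, exam >= 5

    # Example: exam parts (6, 5, 7) -> exam mark about 6.0, final grade about 6.85.
    print(continuous_assessment_grade(8, 7, 9, (6, 5, 7)))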

Extraordinary assessment (evaluación extraordinaria)

Article 19 of the UGR Assessment Policy and Regulations establishes that students who have not passed the course in the ordinary assessment system (evaluación ordinaria) will have access to an extraordinary assessment session (evaluación extraordinaria), in which they will have the opportunity to obtain 100% of their global mark by means of a single exam.

The assessment in this exam will comprise:

  • Exam with objective questions referred to the theoretical content (50%). It is necessary to get a minimum mark of 5 (out of 10) to pass the exam.
  • Test with several practical questions (50%). It is necessary to get a minimum mark of 5 (out of 10) to pass the test.

Single final assessment (evaluación única final)

Article 8 of the UGR Assessment Policy and Regulations establishes that students who are unable to follow the continuous assessment system (evaluación ordinaria) due to justifiable reasons shall have recourse to a single final assessment (evaluación única final), by taking a final single exam.

In order to opt for the single final assessment (evaluación única final), students must send a request addressed to the coordinator of the master's programme, using the corresponding online procedure (https://sede.ugr.es/procs/Gestion-Academica-Solicitud-de-evaluacion-unica-final/), within the first two weeks of the course (or within the two weeks following their enrolment, when enrolment takes place after the course sessions have begun).

For students authorized to do a single final assessment, the assessment will comprise:

  • Exam with objective questions referred to the theoretical content (50%). It is necessary to get a minimum mark of 5 (out of 10) to pass the exam.
  • Practical test(s), exercises and/or problems to be solved individually and handed in to the professors (50%).

Additional information

Inclusion and diversity: In the case of students with disabilities or other specific educational support needs (NEAE), the tutoring system will be adapted to these needs, following the recommendations of the inclusion area at the University of Granada. Departments and centers will establish appropriate measures to ensure that tutorials take place in accessible locations. Additionally, at the request of faculty, support can be requested from the competent unit at UGR for special methodological adaptations.