
Observations and Future Research Possibilities

Software applications continue to grow in complexity and are modified frequently. Accordingly, evaluating the impact of such modifications on existing artefacts is increasingly important, and Change Impact Analysis has been a focus of study for many researchers in the software re-engineering field. Consequently, the volume of research on Change Impact Analysis in the contemporary literature is vast. Nevertheless, few of these works can serve as initial support for future work on Change Impact Analysis.

This chapter frames a set of research questions and possible directions for the future scope of the work, which are outlined below. First, the study identified the limited availability of empirical evaluations of the proposed concepts. Second, no technique was identified that serves the entire software development process, as this requires strong coupling among the multiple phases of the process. Furthermore, most researchers proposed classification schemes for the types of modification and dependency that can affect Change Impact Analysis. As part of the future scope of Change Impact Analysis research, a systematic investigation weighing the pros and cons of the available studies needs to be performed.
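To make the classification problem concrete, the kinds of modification and dependency categories these studies propose can be sketched as a small, hypothetical taxonomy; the category names and propagation rules below are illustrative, not drawn from any single surveyed work:

```python
from enum import Enum

class Modification(Enum):
    """Illustrative modification operations; surveyed works classify these differently."""
    ADD = "add"        # a new element is introduced
    DELETE = "delete"  # an existing element is removed
    UPDATE = "update"  # an element is changed in place
    MOVE = "move"      # an element is relocated to another artefact

class Dependency(Enum):
    """Illustrative dependency relations along which changes may propagate."""
    CALL = "call"    # caller/callee relation in source code
    DATA = "data"    # shared data or state
    TRACE = "trace"  # traceability link, e.g. requirement-to-design

# A propagation rule table: does a given modification propagate along a given
# dependency type? A standard evaluation of modification functions would
# compare such tables across studies; these entries are hypothetical.
PROPAGATES = {
    (Modification.DELETE, Dependency.CALL): True,  # callers of a deleted element break
    (Modification.UPDATE, Dependency.DATA): True,  # consumers of changed data are affected
    (Modification.ADD, Dependency.CALL): False,    # a new, as-yet-unused element is benign
}

def propagates(mod: Modification, dep: Dependency) -> bool:
    # Unlisted combinations default to False; this sketch makes no claim
    # that such a default is safe in practice.
    return PROPAGATES.get((mod, dep), False)
```

The inconsistencies noted above amount to each study filling in a different version of this table, with differently named rows and columns, which is what makes cross-study comparison difficult.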

Future Study Scope

• Limited availability of Change Impact Analysis for heterogeneous artefacts and multi-perspective projects. As discussed in previous chapters, most of the contemporary literature addresses single-perspective projects, and over 65% of the studies operate solely on source code [77]. A few Change Impact Analysis approaches are designed to handle multiple artefact types; nevertheless, these approaches have several shortcomings that restrict their usage to single-perspective projects.

• This book reviewed the available literature in the field and observed inconsistencies among the Change Impact Analysis techniques proposed for modelling modification operations. None of the studies can be regarded as a standard technique for addressing Change Impact Analysis. Since any Change Impact Analysis first requires accurate classification of modification operations and then their successful modelling, a standard evaluation of the various modification operations is necessary.
• Dependency relations, through which modifications propagate across artefacts, face similar challenges. The dependency approaches proposed in the surveyed works are inadequately classified and not clearly specified. In addition, the proposed categories are often not comprehensive and contradict one another. Accordingly, a much more detailed examination of these relations is necessary to ensure acceptable Change Impact Analysis quality.
• The presentation of Change Impact Analysis results is a crucial objective of this book. Only a small number of techniques were observed to show programmers why a given artefact is affected by a proposed modification and which steps should be taken to handle the change. Irrespective of how the outcomes of Change Impact Analysis are ultimately used, a clear understanding of the process and of the reason for each impact is essential, and all possible impacts should be handled consistently.
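Most of the techniques surveyed in this book share one underlying operation: propagating a proposed modification through dependency relations to obtain a candidate impact set. A minimal sketch of that operation, with a hypothetical dependency graph over heterogeneous artefacts, could look like:

```python
from collections import deque

def impact_set(dependents, changed):
    """Return the set of artefacts transitively affected by `changed`.

    `dependents` maps each artefact to the artefacts that depend on it,
    so a change propagates along these edges (breadth-first reachability).
    """
    affected = set()
    queue = deque([changed])
    while queue:
        artefact = queue.popleft()
        for dep in dependents.get(artefact, ()):
            if dep not in affected:
                affected.add(dep)
                queue.append(dep)
    return affected

# Hypothetical heterogeneous artefacts: a requirement, a design model,
# two source files, and a test suite.
deps = {
    "REQ-1": ["Design-A"],
    "Design-A": ["src/main.java", "src/util.java"],
    "src/util.java": ["tests/util_test.java"],
}
print(sorted(impact_set(deps, "REQ-1")))
# ['Design-A', 'src/main.java', 'src/util.java', 'tests/util_test.java']
```

The traversal terminates even on cyclic dependency graphs because each artefact is enqueued at most once; the surveyed techniques differ mainly in how the edges of such a graph are obtained, classified, and filtered, which is precisely where the gaps identified above lie.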

References

  • 1. Arnold, Robert S., and Shawn A. Bohner. “Impact analysis-towards a framework for comparison.” 1993 Conference on Software Maintenance, CSM-93. IEEE. 1993.
  • 2. Turver, Richard J., and Malcolm Munro. “An early impact analysis technique for software maintenance.” Journal of Software: Evolution and Process 6.1 (1994): 35-52.
  • 3. Hassine, Jameleddine, et al. “Change impact analysis for requirement evolution using use case maps.” Eighth International Workshop on Principles of Software Evolution. IEEE, 2005.
  • 4. Shiri, Maryam, Jameleddine Hassine, and Juergen Rilling. “A requirement level modification analysis support framework.” Third International IEEE Workshop on Software Evolvability 2007. IEEE, 2007.
  • 5. Breech, Ben, et al. “Online impact analysis via dynamic compilation technology.” 20th IEEE International Conference on Software Maintenance, 2004. Proceedings. IEEE. 2004.
  • 6. Law, James, and Gregg Rothermel. “Incremental dynamic impact analysis for evolving software systems.” 14th International Symposium on Software Reliability Engineering, 2003. ISSRE 2003. IEEE, 2003.
  • 7. Bennett, Keith H., and Vaclav T. Rajlich. “Software maintenance and evolution: A roadmap.” Proceedings of the Conference on the Future of Software Engineering. ACM, 2000.
  • 8. Bohner, Shawn Anthony. “A graph traceability approach for software change impact analysis.”, Dissertation, George Mason University. 1995.
  • 9. Horwitz, Susan, Thomas Reps, and David Binkley. “Interprocedural slicing using dependence graphs.” ACM SIGPLAN Notices 39.4 (2004): 229-243.
  • 10. Podgurski, Andy, and Lori A. Clarke. “A formal model of program dependences and its implications for software testing, debugging, and maintenance.” IEEE Transactions on Software Engineering 16.9 (1990): 965-979.
  • 11. Ferrante, Jeanne, Karl J. Ottenstein, and Joe D. Warren. “The program dependence graph and its use in optimization.” ACM Transactions on Programming Languages and Systems 9 (1987): 319-349.
  • 12. Kama, Nazri, and Faizul Azli Abdul Ridzab. “Requirement level impact analysis with impact prediction filter.” 4th International Conference on Software Technology and Engineering (ICSTE 2012). 2012.
  • 13. Kama, Nazri, Tim French, and Mark Reynolds. “Considering patterns in class interactions prediction.” In International Conference on Advanced Software Engineering and Its Applications (2010, December) (pp. 11-22). Springer, Berlin, Heidelberg.
  • 14. Gotel, Orlena Cara Zena. Contribution Structures for Requirements Traceability. Dissertation, University of London, London, United Kingdom, 1995.
  • 15. Ali, Hassan Osman, M. Z. Abd Rozan, and Abdullahi Mohamud Sharif. “Identifying challenges of change impact analysis for software projects.” 2012 International Conference on Innovation Management and Technology Research (ICIMTR). IEEE, 2012.
  • 16. Lee, Michelle L. Change Impact Analysis of Object-Oriented Software. Fairfax, VA: George Mason University, 1998.
  • 17. Pfleeger, Shari Lawrence, and Joanne M. Atlee. Software Engineering: Theory and Practice. Chennai, India: Pearson Education India, 1998.
  • 18. Pfleeger, Shari Lawrence, and Shawn A. Bohner. “A framework for software maintenance metrics.” Conference on Software Maintenance, 1990. Proceedings. IEEE. 1990.
  • 19. Bohner, Shawn A. “Software change impacts-an evolving perspective.” International Conference on Software Maintenance, 2002. Proceedings. IEEE, 2002.
  • 20. Lindvall, Mikael, and Kristian Sandahl. “How well do experienced software developers predict software change?.” Journal of Systems and Software 43.1 (1998): 19-27.
  • 21. Cohen, Jacob. “A coefficient of agreement for nominal scales.” Educational and Psychological Measurement 20.1 (1960): 37-46.
  • 22. Bohner, Shawn A. “Extending software change impact analysis into COTS components.” 27th Annual NASA Goddard/IEEE Software Engineering Workshop, 2002. Proceedings. IEEE, 2002.
  • 23. Hassaine, Salima, et al. “A seismology-inspired approach to study change propagation.” 2011 27th IEEE International Conference on Software Maintenance (ICSM). IEEE, 2011.
  • 24. Popescu, Daniel. “Impact analysis for event-based components and systems.” 2010 ACM/IEEE 32nd International Conference on Software Engineering, Vol. 2. IEEE, 2010.
  • 25. Ren, Xiaoxia, et al. “Chianti: A tool for change impact analysis of java programs.” ACM Sigplan Notices 39.10 (2004): 432-448.
  • 26. Stoerzer, Maximilian, et al. “Finding failure-inducing changes in java programs using change classification.” Proceedings of the 14th ACM SIGSOFT International Symposium on Foundations of Software Engineering. ACM, 2006.
  • 27. Law, James, and Gregg Rothermel. “Whole program path-based dynamic impact analysis.” Proceedings of the 25th International Conference on Software Engineering. IEEE Computer Society, 2003.
  • 28. Apiwattanapong, Taweesup, Alessandro Orso, and Mary Jean Harrold. “Efficient and precise dynamic impact analysis using execute-after sequences.” Proceedings of the 27th International Conference on Software Engineering. ACM, 2005.
  • 29. Huang, Lulu, and Yeong-Tae Song. “Precise dynamic impact analysis with dependency analysis for object-oriented programs.” Proceedings of the 5th ACIS International Conference on Software Engineering Research, Management & Applications. IEEE Computer Society, 2007.
  • 30. Tip, Frank. A Survey of Program Slicing Techniques. Amsterdam, Netherlands: Centrum voor Wiskunde en Informatica, 1994.
  • 31. Ranganath, Venkatesh Prasad, and John Hatcliff. “Slicing concurrent Java programs using Indus and Kaveri.” International Journal on Software Tools for Technology Transfer 9.5-6 (2007): 489-504.
  • 32. The Wisconsin Program-Slicing Tool, http://research.cs.wisc.edu/wpis/html/, May 2014. (Accessed on November, 20th 2014).
  • 33. Girba, Tudor, Stéphane Ducasse, and Michele Lanza. “Yesterday’s weather: Guiding early reverse engineering efforts by summarizing the evolution of changes.” 20th IEEE International Conference on Software Maintenance, 2004. Proceedings. IEEE. 2004.
  • 34. Zimmermann, Thomas, et al. “Mining version histories to guide software changes.” IEEE Transactions on Software Engineering 31.6 (2005): 429-445.
  • 35. Kagdi, Huzefa, et al. “Blending conceptual and evolutionary couplings to support change impact analysis in source code.” 2010 17th Working Conference on Reverse Engineering (WCRE). IEEE, 2010.
  • 36. Canfora, Gerardo, et al. “Using multivariate time series and association rules to detect logical change coupling: An empirical study.” 2010 IEEE International Conference on Software Maintenance (ICSM). IEEE, 2010.
  • 37. Bode, Stephan. Quality Goal Oriented Architectural Design and Traceability for Evolvable Software Systems. Dissertation, Technische Universität Ilmenau, Germany, 2011.
  • 38. Cavnar, William B., and John M. Trenkle. “N-gram-based text categorization.” Ann Arbor MI 48113.2 (1994): 161-175.
  • 39. Marcus, Andrian, and Jonathan I. Maletic. “Recovering documentation-to-source-code traceability links using latent semantic indexing.” Proceedings of the 25th International Conference on Software Engineering. IEEE Computer Society, 2003.
  • 40. Poshyvanyk, Denys, et al. “Using information retrieval based coupling measures for impact analysis.” Empirical Software Engineering 14.1 (2009): 5-32.

  • 41. Binkley, David, and Dawn Lawrie. “Information retrieval applications in software maintenance and evolution.” Encyclopedia of Software Engineering

  • 42. Sharafat, Ali R., and Ladan Tahvildari. “A probabilistic approach to predict changes in object-oriented software systems.” 11th European Conference on Software Maintenance and Reengineering, 2007, CSMR'07. IEEE. 2007.
  • 43. Ceccarelli, Michele, et al. “An eclectic approach for change impact analysis.” (2010).
  • 44. Cabot, Jordi, and Martin Gogolla. “Object constraint language (OCL): a definitive guide.” International School on Formal Methods for the Design of Computer, Communication and Software Systems. Springer, Berlin, Heidelberg, April 2012.
  • 45. World Wide Web Consortium. Xml Path Language (xpath) 2.0. (2010).
  • 46. EMF Query, http://projects.eclipse.org/projects/modeling.emf. query. (Accessed on November, 20th 2014).
  • 47. Briand, Lionel C., Yvan Labiche, and George Soccar. “Automating impact analysis and regression test selection based on UML designs.” International Conference on Software Maintenance, 2002. Proceedings. IEEE, 2002.
  • 48. Müller, Klaus, and Bernhard Rumpe. “A model-based approach to impact analysis using model differencing.” In Proc. International Workshop on Software Quality and Maintainability (SQM'14), ECEASST Journal, vol. 65, 2014.
  • 49. Gethers, Malcom, et al. “Integrated impact analysis for managing software changes.” 2012 34th International Conference on Software Engineering (ICSE). IEEE, 2012.
  • 50. Arnold, Robert S. Software Change Impact Analysis. Washington, DC, United States: IEEE Computer Society Press. 1996.
  • 51. Anezin, D. Process and Methods for Requirements Tracing (Software Development Life Cycle). Dissertation, George Mason University. (1994).
  • 52. Han, Jun. “Specifying the structural properties of software documents.” Journal of Computing and Information 1 (1994): 1333-1351.
  • 53. Davis, Jesse, and Mark Goadrich. “The relationship between Precision-Recall and ROC curves.” Proceedings of the 23rd International Conference on Machine Learning. ACM, 2006.
  • 54. Vallabhaneni, S. Rao. Auditing the Maintenance of Software. New Delhi, India: Prentice-Hall, Inc., 1987.
  • 55. Arthur, L. J., Software Evaluation. New Jersey, United States: John Wiley and Sons. 1999.
  • 56. Jashki, Mohammad-Amin, Reza Zafarani, and Ebrahim Bagheri. “Towards a more efficient static software change impact analysis method.” Proceedings of the 8th ACM SIGPLAN-SIGSOFT Workshop on Program Analysis for Software Tools and Engineering. ACM. 2008.
  • 57. Li, Yin, et al. “Requirement-centric traceability for change impact analysis: A case study.” International Conference on Software Process. Springer, Berlin, Heidelberg, 2008.
  • 58. Amyot, Daniel, and Gunter Mussbacher. “URN: Towards a new standard for the visual description of requirements.” Lecture Notes in Computer Science 2599 (2003): 21-37.
  • 59. Dahlstedt, Asa G., and Anne Persson. “Requirements interdependencies-moulding the state of research into a research agenda.” The Ninth International Workshop on Requirements Engineering: Foundation for Software Quality (REFSQ 2003), Klagenfurt/Velden, Austria, 2003.
  • 60. De Lucia, Andrea. “Information retrieval models for recovering traceability links between code and documentation.” International Conference on Software Maintenance, 2000. Proceedings. IEEE, 2000.
  • 61. Rohatgi, Abhishek, Abdelwahab Hamou-Lhadj, and Juergen Rilling. “An approach for mapping features to code based on static and dynamic analysis.” The 16th IEEE International Conference on Program Comprehension, 2008. ICPC 2008. IEEE, 2008.
  • 62. Spanoudakis, George. “Plausible and adaptive requirement traceability structures.” Proceedings of the 14th International Conference on Software Engineering and Knowledge Engineering. ACM, 2002.
  • 63. O'Neal, James S. “Analyzing the impact of changing requirements.” Proceedings of the IEEE International Conference on Software Maintenance (ICSM'01). IEEE Computer Society, 2001.
  • 64. Adams, Bram, et al. “The evolution of the Linux build system.” Electronic Communications of the EASST 8 (2008).
  • 65. Seo, Hyunmin, et al. “Programmers’ build errors: A case study (at google).” Proceedings of the 36th International Conference on Software Engineering. ACM, 2014.
  • 66. Kerzazi, Noureddine, Foutse Khomh, and Bram Adams. “Why do automated builds break? An empirical study.” 2014 IEEE International Conference on Software Maintenance and Evolution (ICSME). IEEE, 2014.
  • 67. McIntosh, Shane, et al. “Mining co-change information to understand when build changes are necessary.” 2014 IEEE International Conference on Software Maintenance and Evolution (ICSME). IEEE, 2014.
  • 68. Xia, Xin, et al. “Cross-project build co-change prediction.” 2015 IEEE 22nd International Conference on Software Analysis, Evolution and Reengineering (SANER). IEEE, 2015.
  • 69. Fluri, Beat, and Harald C. Gall. “Classifying change types for qualifying change couplings.” 14th IEEE International Conference on Program Comprehension, 2006. ICPC 2006. IEEE, 2006.
  • 70. Gall, Harald C., Beat Fluri, and Martin Pinzger. “Change analysis with evolizer and changedistiller.” IEEE Software 26.1 (2009): 26.
  • 71. Fluri, Beat, et al. “Change distilling: Tree differencing for fine-grained source code change extraction.” IEEE Transactions on Software Engineering 33.11 (2007): 725-743.
  • 72. Hattori, Lile P., and Michele Lanza. “On the nature of commits.” Proceedings of the 23rd IEEE/ACM International Conference on Automated Software Engineering. IEEE Press, 2008.
  • 73. Ibrahim, Walid M., et al. “Should I contribute to this discussion?” 2010 7th IEEE Working Conference on Mining Software Repositories (MSR). IEEE. 2010.
  • 74. Knab, Patrick, Martin Pinzger, and Abraham Bernstein. “Predicting defect densities in source code files with decision tree learners.” Proceedings of the 2006 International Workshop on Mining Software Repositories. 2006.
  • 75. Romano, Daniele, and Martin Pinzger. “Using source code metrics to predict change-prone java interfaces.” 2011 27th IEEE International Conference on Software Maintenance (ICSM). IEEE, 2011.
  • 76. Giger, Emanuel, Martin Pinzger, and Harald C. Gall. “Comparing fine-grained source code changes and code churn for bug prediction.” Proceedings of the 8th Working Conference on Mining Software Repositories. ACM, 2011.
  • 77. Lehnert, Steffen. “A review of software change impact analysis.” Ilmenau University of Technology, Ilmenau, Germany. Technical Report (2011).
  • 78. Hammad, Maen, Michael L. Collard, and Jonathan I. Maletic. “Automatically identifying changes that impact code-to-design traceability.” IEEE 17th International Conference on Program Comprehension, 2009, ICPC'09. IEEE. 2009.
  • 79. Kotonya, Gerald, and John Hutchinson. “Analysing the impact of change in COTS-based systems.” Lecture Notes in Computer Science 3412 (2005): 212-222.
  • 80. Khan, Safoora Shakil, and Simon Lock. “Concern tracing and change impact analysis: An exploratory study.” ICSE Workshop on Aspect-Oriented Requirements Engineering and Architecture Design, 2009. EA'09. IEEE, 2009.
  • 81. Ibrahim, Suhaimi, et al. “A requirements traceability to support change impact analysis.” Asian Journal of Information Technology
  • 4.4 (2005): 345-355.
  • 82. Ibrahim, S., N. B. Idris, M. Munro, and A. Deraman. A software traceability validation for change impact analysis of object oriented software. Proceedings of the International Conference on Software Engineering Research and Practice & Conference on Programming Languages and Compilers, SERP 2006, volume 1, pages 453-459, Las Vegas, Nevada, USA. June 2006.
  • 83. Goeritzer, Robert. “Using impact analysis in industry.” 2011 33rd International Conference on Software Engineering (ICSE). IEEE, 2011.
  • 84. de Souza, Cleidson, and David Redmiles. “An empirical study of software developers' management of dependencies and changes.” 2008 ACM/IEEE 30th International Conference on Software Engineering. IEEE, 2008.
  • 85. Li, Bixin, et al. “A survey of code-based change impact analysis techniques.” Software Testing, Verification and Reliability 23.8 (2013): 613-646.
  • 86. Toth, Gabriella, et al. “Comparison of different impact analysis methods and programmer's opinion: An empirical study.” Proceedings of the 8th International Conference on the Principles and Practice of Programming in Java. ACM, 2010.
  • 87. Weiser, M. “Program slicing.” ICSE '81: Proceedings of the 5th International Conference on Software Engineering, 439-449. IEEE Press, Piscataway, NJ, USA, 1981.
  • 88. Silva, Josep. “A vocabulary of program slicing-based techniques.” ACM Computing Surveys (CSUR) 44.3 (2012): 12.
  • 89. Harman, Mark, Margaret Okunlawon, Bala Sivagurunathan and Sebastian Danicic. “Slice-based measurement of coupling.” 19th ICSE, Workshop on Process Modeling and Empirical Studies of Software Evolution, Boston, Massachusetts, USA, May 1997.
  • 90. Briand, Lionel C., Jurgen Wust, and Hakim Lounis. “Using coupling measurement for impact analysis in object-oriented systems.” Proceedings IEEE International Conference on Software Maintenance 1999 (ICSM'99), 'Software Maintenance for Business Change' (Cat. No. 99CB36360). IEEE, 1999.
  • 91. Harman, Mark, and Robert Hierons. “An overview of program slicing.” Software Focus 2.3 (2001): 85-92.
  • 92. Zanjani, Motahareh Bahrami, George Swartzendruber, and Huzefa Kagdi. “Impact analysis of change requests on source code based on interaction and commit histories.” Proceedings of the 11th Working Conference on Mining Software Repositories. ACM, 2014.
  • 93. Lyle, James R., et al. “Unravel: A CASE tool to assist evaluation of high integrity software. Volume 1: Requirements and design.” National Institute of Standards and Technology, Computer Systems Laboratory, Gaithersburg, MD, 1995.
  • 94. Venkatesh, G. A. “Experimental results from dynamic slicing of C programs.” ACM Transactions on Programming Languages and Systems (TOPLAS) 17.2(1995): 197-216.
  • 95. Teitelbaum, Tim. “Codesurfer.” ACM SIGSOFT Software Engineering Notes 25.1 (2000): 99.
  • 96. Bilal, Haider, and Sue Black. “Using the ripple effect to measure software quality.” International Conference on Software Quality Management. Vol. 13, Cheltenham, Gloucestershire, UK, 2005.
  • 97. Dit, Bogdan, et al. “Impactminer: A tool for change impact analysis.” Companion Proceedings of the 36th International Conference on Software Engineering. ACM, 2014.
  • 98. Lattix Inc. Lattix, 2017. http://lattix.com/. (Accessed on July, 11th 2018)
  • 99. Ranganath, Venkatesh Prasad, and John Hatcliff. “An overview of the indus framework for analysis and slicing of concurrent java software (keynote talk-extended abstract).” 2006 Sixth IEEE International Workshop on Source Code Analysis and Manipulation. IEEE, 2006.
  • 100. Jayaraman, Ganeshan, Venkatesh Prasad Ranganath, and John Hatcliff. “Kaveri: Delivering the indus java program slicer to eclipse.” International Conference on Fundamental Approaches to Software Engineering. Springer, Berlin, Heidelberg, 2005.
  • 101. Buckner, Jonathan, et al. “JRipples: A tool for program comprehension during incremental change.” 13th International Workshop on Program Comprehension (IWPC'05). IEEE, 2005.
  • 102. Coder Gears. JArchitect, 2017. http://www.jarchitect.com. (Accessed on August, 27th 2018).
  • 103. Cuoq, Pascal, et al. “Frama-c.” International Conference on Software Engineering and Formal Methods. Springer, Berlin, Heidelberg, 2012.
  • 104. Hattori, Lile, et al. “On the precision and accuracy of impact analysis techniques.” Seventh IEEE/ACIS International Conference on Computer and Information Science (ICIS 2008). IEEE, 2008.
  • 105. Wang, Tao, and Abhik Roychoudhury. “Using compressed bytecode traces for slicing Java programs.” Proceedings of the 26th International Conference on Software Engineering. IEEE Computer Society. 2004.
  • 106. Follett, Matthew, and Orland Hoeber. “ImpactViz: Visualizing class dependencies and the impact of changes in software revisions.” Proceedings of the 5th International Symposium on Software Visualization. ACM, 2010.
  • 107. Zhang, Lingming, Miryung Kim, and Sarfraz Khurshid. “FaultTracer: A change impact and regression fault analysis tool for evolving Java programs.” Proceedings of the ACM SIGSOFT 20th International Symposium on the Foundations of Software Engineering. ACM, 2012.
  • 108. Acharya, Mithun, and Brian Robinson. “Practical change impact analysis based on static program slicing for industrial software systems.” Proceedings of the 33rd International Conference on Software Engineering. ACM, 2011.
  • 109. Lehnert, Steffen. “A taxonomy for software change impact analysis.” Proceedings of the 12th International Workshop on Principles of Software Evolution and the 7th Annual ERCIM Workshop on Software Evolution. ACM, 2011.
 