Conference Information
ISSTA 2025: International Symposium on Software Testing and Analysis
https://conf.researchr.org/home/ISSTA-2025
Submission Date: 2024-10-31
Notification Date: 2024-12-19
Conference Date: 2025-06-25
Location: Trondheim, Norway
Years: 34
CCF: a | CORE: a | QUALIS: a2
Call For Papers
We invite high-quality submissions, from both industry and academia, describing original and unpublished results of theoretical, empirical, conceptual, and experimental research on software testing and analysis. ISSTA invites three kinds of submissions. The majority of submissions are expected to be "Research Papers", but submissions that best fit the description of "Experience Papers" or "Replicability Studies" should be submitted as such. A good Experience Paper will include lessons learned or other wisdom synthesised for the community from the reported experience. Replicability Studies shall clearly describe their purpose and value beyond the original result.

NEW THIS YEAR: The conference proceedings will be published in the Proceedings of the ACM on Software Engineering (PACMSE), Issue: ISSTA 2025.

Research Papers
Authors are invited to submit research papers describing original contributions in testing or analysis of computer software. Papers describing original theoretical or empirical research, new techniques, methods for emerging systems, in-depth case studies, infrastructures of testing and analysis, or tools are welcome.

Experience Papers
Authors are invited to submit experience papers describing a significant experience in applying software testing and analysis methods or tools. Such papers should carefully identify and discuss important lessons learned, so that other researchers and/or practitioners can benefit from the experience.

Replicability Studies
ISSTA would like to encourage researchers to replicate results from previous papers. A replicability study must go beyond simply re-implementing an algorithm and/or re-running the artefacts provided by the original paper. It should at the very least apply the approach to new, significantly broadened inputs. In particular, replicability studies are encouraged to target techniques that were previously evaluated only on proprietary subject programs or inputs.
A replicability study should clearly report on results that the authors were able to replicate, as well as on aspects of the work that were not replicable. In the latter case, authors are encouraged to communicate or collaborate with the original paper's authors to determine the cause of any observed discrepancies and, if possible, address them (e.g., through minor implementation changes). We explicitly encourage authors not to focus on a single paper/artefact only, but instead to perform a comparative experiment covering multiple related approaches.

Replicability studies should follow the ACM guidelines on replicability (different team, different experimental setup): the measurement can be obtained with stated precision by a different team, with a different measuring system, in a different location, on multiple trials. For computational experiments, this means that an independent group can obtain the same result using artefacts which they develop completely independently. Moreover, it is generally insufficient to focus on reproducibility (i.e., different team, same experimental setup) alone.

Replicability Studies will be evaluated according to the following standards:
- Depth and breadth of experiments
- Clarity of writing
- Appropriateness of conclusions
- Amount of useful, actionable insights
- Availability of artefacts

We expect replicability studies to clearly point out the artefacts the study is built on, and to submit those artefacts to the artefact evaluation. Artefacts evaluated positively will be eligible for the prestigious Results Reproduced badge.

Major Revisions
Papers submitted to the initial deadline may be accepted, rejected, or may receive a chance to submit a major revision of the initial submission by the major revision deadline.
Last updated by Dou Sun on 2024-06-30
Acceptance Ratio
Year | Submitted | Accepted | Accepted(%) |
---|---|---|---|
2014 | 128 | 36 | 28.1% |
2013 | 124 | 32 | 25.8% |
2012 | 108 | 31 | 28.7% |
2011 | 121 | 35 | 28.9% |
2010 | 105 | 24 | 22.9% |
2009 | 93 | 25 | 26.9% |
2008 | 100 | 35 | 35.0% |
2007 | 103 | 22 | 21.4% |
2006 | 84 | 22 | 26.2% |
2004 | 93 | 28 | 30.1% |
2002 | 97 | 26 | 26.8% |
2000 | 73 | 21 | 28.8% |
1998 | 47 | 16 | 34.0% |
1996 | 69 | 24 | 34.8% |
Related Conferences
Short | Full Name | Submission | Conference |
---|---|---|---|
ICMAA | International Conference on Mechanical, Aeronautical and Automotive Engineering | 2024-11-30 | 2025-04-02 |
ENASE | International Conference on Evaluation of Novel Approaches to Software Engineering | 2019-02-21 | 2019-05-04 |
IWEP | International Workshop on Engineering Physics | 2023-10-28 | 2023-11-15 |
MOBIWAC | International Symposium on Mobility Management and Wireless Access | 2023-07-15 | 2023-10-30 |
RO-MAN | IEEE International Conference on Robot and Human Interactive Communication | 2018-03-19 | 2018-08-27 |
ICESS | International Conference on Embedded Software and Systems | 2025-03-15 | 2025-06-26 |
Sarnoff | IEEE Sarnoff Symposium | 2019-07-01 | 2019-09-23 |
ICITBE | International Conference on Information Technology and Biomedical Engineering | 2022-09-30 | 2022-12-23 |
CALDAM | International Conference on Algorithms and Discrete Applied Mathematics | 2018-10-01 | 2019-02-14 |
SecTech | International Conference on Security Technology | 2015-10-10 | 2015-11-25 |
Related Journals
CCF | Full Name | Impact Factor | Publisher | ISSN |
---|---|---|---|---|
 | Nano Today | 13.20 | Elsevier | 1748-0132 |
 | International Journal of Information Management | 20.10 | Elsevier | 0268-4012 |
 | Journal of Computational Neuroscience | 1.500 | Springer | 0929-5313 |
 | Diamond and Related Materials | 4.300 | Elsevier | 0925-9635 |
c | IET Information Security | 1.300 | IET | 1751-8709 |
 | Discover Applied Sciences | 2.800 | Springer | 3004-9261 |
 | Chemometrics and Intelligent Laboratory Systems | 3.700 | Elsevier | 0169-7439 |
 | Materials Science and Engineering: R: Reports | 31.60 | Elsevier | 0927-796X |
 | Journal of Computing in Civil Engineering | 4.700 | ASCE | 0887-3801 |
 | Journal of Computer Languages | 1.700 | Elsevier | 2665-9182 |