About SBFT

Optimization techniques can be applied to many aspects of the software development process; this research area is known as Search-Based Software Engineering (SBSE). In previous workshop editions, we focused on the application of SBSE to testing tasks, the so-called Search-Based Software Testing (SBST). Ongoing research on SBST and fuzz testing is proposing techniques that address similar testing problems with similar goals. This has led to the decision to rename the workshop to Search-Based and Fuzz Testing (SBFT). SBFT strategies have been applied to a wide variety of testing goals, including achieving high coverage, finding faults and vulnerabilities, and checking various state-based and non-functional properties (e.g., scalability, acceptance).

The central objective of this workshop is to bring together researchers and industrial practitioners from SBST, fuzzing, and the wider software engineering community to share experience and provide directions for future research on the automation of software testing. A second objective is to encourage the use of search and fuzzing techniques to combine testing with other software engineering areas. SBFT is a one-day workshop that comprises a research track, keynotes, and popular testing tool competitions. Additionally, the workshop brings together experts for a panel discussion. All of these activities will contribute to breaking new ground in SBFT research.

Attending SBFT

SBFT 2023 is co-located with ICSE 2023 and will be held in a hybrid setting.
To attend SBFT, you must register for our workshop using the official ICSE registration link.
Once you have registered for SBFT, the ICSE team will send you an e-mail with an invitation to attend the workshop online or in person, according to your preference.

As we did last year, we will also live-stream SBFT 2023 via Twitch. If you are not registered, feel free to join our stream and ask questions in the chat.

Special Issue

We are pleased to announce that SBFT'23 is hosting an open-call special issue of the Science of Computer Programming journal, titled SBFT'23: Search-Based and Fuzz Testing Tools.

Researchers and practitioners are invited to submit tools showcasing new or improved SBFT approaches. It is important to mention that this call for original software does not publish papers accompanied by software; rather, it expects short papers supported by artifacts, such as the original software described in the papers together with all the experimental packages and other data needed for reproducibility. Hence, we invite submissions showing practical experience of using SBST techniques and tools.

This is the perfect opportunity to showcase your tool. Use the following link for more information and submission instructions: Science of Computer Programming Call

Call for Papers

Researchers and practitioners are invited to submit:

  • Full papers (maximum of 8 pages, including references): original research in SBFT, either empirical, theoretical, or showing practical experience of using SBFT techniques and/or SBFT tools.
  • Short papers (maximum of 4 pages, including references): work that describes novel techniques, ideas, and positions that have yet to be fully developed; or a discussion of the importance of a recently published SBFT result by another author in setting a direction for the SBFT community, and/or the potential applicability (or not) of the result in an industrial context.
  • Position papers (maximum of 2 pages, including references): papers that analyze trends in SBFT and raise issues of importance. Position papers are intended to seed discussion and debate at the workshop, and thus will be reviewed with respect to relevance and their ability to spark discussion.
  • Tool competition entries (maximum of 4 pages, including references): we invite researchers, students, and tool developers to design innovative new approaches to software test generation.

In all cases, papers should address a problem in the software testing/verification/validation domain or combine elements of those domains with other concerns in the software engineering lifecycle. Examples of problems in the software testing/verification/validation domain include (but are not limited to) generating test data, fuzzing, prioritizing test cases, constructing test oracles, minimizing test suites, verifying software models, testing service-oriented architectures, constructing test suites for interaction testing, SBFT for AI applications, machine learning techniques for SBFT, and validating real-time properties.

The solution should apply any kind of fuzzing or a metaheuristic search strategy such as (but not limited to) random search, local search (e.g. hill climbing, simulated annealing, and tabu search), evolutionary algorithms (e.g. genetic algorithms, evolution strategies, and genetic programming), ant colony optimization, particle swarm optimization, and multi-objective optimization.

Sponsors

Google, IEEE TCSE, ACM SIGSOFT
Submission site

EasyChair

Important Dates

Adhering to ICSE’23 workshop dates (AOE):

Printable Flyer

Printable Call for Papers

Paper Submission: January 13, 2023 (extended to January 20, 2023)

Competition Report: January 13, 2023 (extended to January 20, 2023)

Notification to Authors: February 21, 2023

Camera Ready Due: March 17, 2023

Author's Registration Due: TBA

Submission Guidelines

All submissions must conform to the ICSE'23 formatting and submission instructions. They must be anonymized, in PDF format, and submitted electronically through EasyChair.