Session Recording
Session Date/Time: 04 Nov 2025 19:30
WEBBOTAUTH
Summary
The inaugural meeting of the WEBBOTAUTH working group quickly identified a significant misalignment between its current charter, which focused on developing a specific technical solution (HTTP message signatures for bot authentication), and the community's desire to first establish a clear problem statement, architecture, and comprehensive set of requirements. Presentations on the initial architectural draft and specific solutions were largely sidelined after extensive discussion highlighted fundamental concerns regarding the scope, potential for centralization, privacy implications of bot identification, the cost of proposed mechanisms, and the risk of enabling data discrimination. A strong consensus emerged that the working group needs to step back and re-evaluate its foundational goals and approaches before proceeding with detailed technical specifications.
Key Discussion Points
- Problem Statement and Architecture: Multiple participants (Ekr, Martin Thomson, Brian Campbell) emphasized that the "architecture" draft presented was already a specific solution, not a problem statement or high-level architecture. There was a strong call for a document that clearly defines the problem and the desired security properties, and lays out the "why" before diving into technical details.
- Scope and Use Cases:
- Original Intent: Mark Nottingham, a proponent of the work, outlined the initial problem as the "horrid state" of identifying non-browser traffic (e.g., search crawlers, archiving bots) using unreliable methods like user agents or IP addresses, aiming for cryptographic identification as a better primitive.
- Legitimate Scraping: Sarah (Sequentum) presented extensive use cases for legitimate web scraping across commercial, governmental, and civic sectors (e.g., e-commerce price monitoring, fighting human trafficking, NGO hate trackers). She argued that public websites should not have the ability to arbitrarily block such activity and raised concerns about "data discrimination" and the creation of "privileged superhighways" for certain clients.
- Public Interest Actors: Alyssa highlighted the needs of researchers, archivists, and cultural heritage preservationists, noting their lack of resources and potential vulnerability to blocking.
- Controversial Discrimination: Ekr and Victoria Noble (EFF) drew a sharp distinction between managing excessive or abusive traffic (seen as a legitimate technical problem) and enabling sites to discriminate against or exclude bots based on who they are (e.g., competitors, journalists, watchdogs). The latter was viewed as highly problematic and something the IETF should not facilitate. John Levine provided legal context, noting that while `robots.txt` is advisory (a sketch of what "advisory" means in practice follows this list), courts have in some cases upheld a site's right to block specific scrapers.
- Centralization Risk: Several speakers, including Mark Nottingham, Martin Thomson, and Chris Patton, voiced concerns that any proposed solution must actively mitigate risks of centralization, avoiding the creation of new choke points or barriers for smaller entities. They noted that current practices (e.g., relying on IP addresses) already concentrate power.
- Privacy Implications: Chris Patton noted a personal shift in perspective, recognizing that organizational privacy (the right not to be treated differently) is a significant concern. The potential for the "generic anonymous view of the web" to disappear was raised (DKG), and a future where "everybody in the internet has to be in some bot directory or have an iPhone" was explicitly called a "bad world" (Ekr).
- Performance and Cost: Martin Thomson highlighted that the proposed HTTP message signatures are "not cheap" and questioned whether the per-request cost is justifiable at the scale of millions of requests, for both clients and servers (a cost sketch follows this list).
- Technical Alternatives/Considerations:
- Brian Campbell suggested that simplified mTLS could be a more efficient approach than per-request signatures (a sketch follows this list).
- Chris Patton proposed anonymous credentials (similar to Privacy Pass) as a privacy-preserving mechanism for applying restrictions like rate limiting, albeit more expensive than simple digital signatures.
- Sandor (co-presenter of the architecture draft) noted that mTLS might take "many years" to implement broadly.
- "Bot" Definition: Discussion clarified that the working group's initial charter aimed to address the identification of a "specific AI agent/product" (e.g., a server machine running a particular bot), rather than linking requests to individual end-users or specific instances of user-controlled software.
- Charter Limitations: Roman pointed out that the current charter is focused solely on writing specifications and does not explicitly allow for the production of architecture or requirements documents, creating a procedural challenge for the desired shift in focus.
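To ground the `robots.txt` point above: compliance is entirely a client-side choice, which is what makes the file advisory. A minimal sketch using Python's standard urllib.robotparser (the URL and user-agent string are hypothetical):

```python
from urllib import robotparser

# Fetch and parse the site's robots.txt (hypothetical URL).
rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

# Nothing on the wire enforces the answer; a polite crawler asks
# before fetching, while a non-compliant one simply never asks.
if rp.can_fetch("ExampleBot/1.0", "https://example.com/prices"):
    print("allowed by robots.txt")
else:
    print("disallowed; a compliant bot skips this URL")
```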
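On the performance point: the concern is per-request public-key operations on both sides. The sketch below times one Ed25519 sign-plus-verify over a simplified signature base; this is a rough stand-in, not the exact RFC 9421 serialization, and the covered components are hypothetical. It assumes the pyca/cryptography package:

```python
import time
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

key = Ed25519PrivateKey.generate()
pub = key.public_key()

# Simplified stand-in for an RFC 9421 signature base: the components
# a bot would cover on every single request (hypothetical values).
signature_base = (
    b'"@method": GET\n'
    b'"@authority": example.com\n'
    b'"@path": /prices\n'
    b'"@signature-params": ("@method" "@authority" "@path");alg="ed25519"'
)

N = 10_000
start = time.perf_counter()
for _ in range(N):
    sig = key.sign(signature_base)   # client-side work, once per request
    pub.verify(sig, signature_base)  # server-side work, once per request
elapsed = time.perf_counter() - start
print(f"~{elapsed / N * 1e6:.0f} µs per signed-and-verified request")
```

Multiplied across millions of requests, even tens of microseconds per operation adds up on both ends, which is the trade-off being questioned.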
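On the mTLS alternative: mutual TLS moves the public-key proof to the handshake, so authentication is paid once per connection rather than once per request. A minimal server-side sketch with Python's standard ssl module (the certificate paths, CA file, and port are hypothetical):

```python
import socket
import ssl

# Require a client certificate at handshake time; every request on the
# resulting connection is then attributable to that certificate without
# any per-request signature.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.load_cert_chain(certfile="server.pem", keyfile="server.key")  # hypothetical paths
ctx.load_verify_locations(cafile="bot-ca.pem")  # hypothetical CA issuing bot certificates
ctx.verify_mode = ssl.CERT_REQUIRED

with socket.create_server(("0.0.0.0", 8443)) as listener:
    with ctx.wrap_socket(listener, server_side=True) as tls_listener:
        conn, addr = tls_listener.accept()  # handshake verifies the client cert once
        print(conn.getpeercert())           # identity available for the connection's lifetime
```

This amortization is what Sandor's "many years" caveat weighs against: mTLS is cheap per request but requires deployment changes on both ends of the connection.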
Decisions and Action Items
- Immediate Session Restructuring: The chairs decided to forgo the remaining presentations scheduled for the session, since they delved into specific solutions that were deemed premature given the direction of the discussion.
- Re-evaluate Charter: A strong consensus emerged, including acknowledgment from Area Director Mike Bishop, that the working group's charter likely needs to be revisited and potentially redrafted to reflect a problem- and architecture-first approach.
- Shift Discussion to Mailing List: The chairs directed the working group to move the ongoing high-level discussions regarding problems, requirements, and potential architectures to the mailing list.
- Straw Man Draft for Discussion: Mark Nottingham volunteered to draft a document outlining use cases and, critically, "counter use cases" (what the mechanism should not be used for) to guide the re-evaluation process.
Next Steps
- AD Review: Area Director Mike Bishop will review the recording of the session.
- Mailing List Discussion: The working group will use the mailing list to continue discussions on:
- A precise problem statement.
- Detailed use cases and "counter use cases."
- High-level architectural considerations and requirements for any proposed solution.
- Clarification of underlying assumptions from all stakeholders.
- Straw Man Draft: Mark Nottingham will circulate a draft covering use cases and counter use cases to facilitate this process.
- Charter Redrafting: Based on the mailing list discussions and the straw man document, the working group will work toward redrafting its charter to better align with the community's consensus, gathering input before submitting the new text to the IESG.