Discussion:
Bits from DPL
Jecs, Attila
2025-03-04 09:30:01 UTC
Permalink
unsubscribe
Dear Debian community,
this is bits from DPL for February.
Ftpmaster team is seeking new team members
==========================================
In December, Scott Kitterman announced his retirement from the project.
I personally regret this, as I vividly remember his invaluable support
during the Debian Med sprint at the start of the COVID-19 pandemic. He
even took time off to ensure new packages cleared the queue in under 24
hours. I want to take this opportunity to personally thank Scott for his
contributions during that sprint and for all his work in Debian.
With one fewer FTP assistant, I am concerned about the increased
workload on the remaining team. I encourage anyone in the Debian
community who is interested to consider reaching out to the FTP masters
about joining their team.
If you're wondering about the role of the FTP masters[1], I'd like to
share this description:
- In truth, they are the heart of the project.
- They know it.
- They do a fantastic job.
I fully agree and see it as part of my role as DPL to ensure this
remains true for Debian's future.
If you're looking for a way to support Debian in a critical role where
many developers will deeply appreciate your work, consider reaching out
to the team. It's a great opportunity for any Debian Developer to
contribute to a key part of the project.
[1] https://ftp-master.debian.org/
Project Status: Six Months of Bug of the Day
============================================
In my Bits from the DPL talk at DebConf24[1], I announced the Tiny Tasks
effort, which I intended to start with a Bug of the Day project[2].
Another idea was an Autopkgtest of the Day, but this has been postponed
due to limited time resources; I cannot run both projects in parallel.
The original goal was to provide small, time-bound examples for
newcomers. To put it bluntly: in terms of attracting new contributors,
it has been a failure so far. My offer to explain individual bug-fixing
commits in detail, if needed, received no response, and despite my
efforts to encourage questions, none were asked.
However, the project has several positive aspects: experienced
developers actively exchange ideas, collaborate on fixing bugs, assess
whether packages are worth fixing or should be removed, and work
together to find technical solutions for non-trivial problems.
So far, the project has been engaging and rewarding every day, bringing
new discoveries and challenges, not just technical but also social.
Fortunately, in the vast majority of cases, I receive positive responses
and appreciation from maintainers. Even in the few instances where help
was declined, it was encouraging to see that in two cases, maintainers
used the ping as motivation to work on their packages themselves. This
reflects the dedication and high standards of maintainers, whose work is
essential to the project's success.
I once used the metaphor that this project is like wandering through a
dark basement with a lone flashlight, exploring aimlessly and discovering
a wide variety of things that have accumulated over the years. Among
them are true marvels with popcon >10,000, ingenious tools, and
delightful games that I only recently learned about. There are also some
packages whose time may have come to an end, but each of them reflects
the dedication and effort of those who maintained them, and that
deserves the utmost respect.
Leaving aside the challenge of attracting newcomers, what have we
achieved since August 1st last year?
* Fixed more than one package per day, typically addressing multiple bugs.
* Added and corrected numerous Homepage fields and watch files.
* The most frequently patched issue was "Fails To Cross-Build From Source"
(all filed with patches).
* Migrated several packages from cdbs/debhelper to dh.
* Rewrote many d/copyright files to DEP5 format and thoroughly reviewed
them.
* Integrated all affected packages into Salsa and enabled Salsa CI.
* Approximately half of the packages were moved to appropriate teams,
while the rest are maintained within the Debian or Salvage teams.
* Regularly performed team uploads, ITS, NMUs, or QA uploads.
* Filed several RoQA bugs to propose package removals where appropriate.
* Reported multiple maintainers to the MIA team when necessary.
With some goodwill, you can see a slight impact on the trends.debian.net
graphs[3] (thank you Lucas for the graphs), but I would never claim that
this project alone is responsible for the progress. What I have also
observed is the steady stream of daily uploads to the delayed queue[4],
demonstrating the continuous efforts of many contributors. This ongoing
work often remains unseen by most, including myself, if not for my
regular check-ins on this list. I would like to extend my sincere thanks
to everyone pushing fixes there, contributing to the overall quality and
progress of Debian's QA efforts.
If you examine the graphs for "Version Control System" and "VCS Hosting"
with the goodwill mentioned above, you might notice a positive trend
since mid-last year. The "Package Smells" category has also seen
reductions in several areas: "no git", "no DEP5 copyright", "compat <9",
and "not salsa". I'd also like to acknowledge the NMUers who have been
working hard to address the "format != 3.0" issue. Thanks to all their
efforts, this specific issue never surfaced in the Bug of the Day
effort, but their contributions deserve recognition here.
The experience I gathered in this project taught me a lot and inspired
some follow-up ideas that we should discuss at a sprint at DebCamp this year.
Finally, if any newcomer finds this information interesting, I'd be
happy to slow down and patiently explain individual steps as needed. All
it takes is asking questions on the Matrix channel[5] to turn this into
a "teaching by example" session.
By the way, for newcomers who are interested, I used quite a few
abbreviations, all of which are explained in the Debian Glossary[6].
[1] https://debconf24.debconf.org/talks/20-bits-from-the-dpl/
[2]
https://salsa.debian.org/qa/tiny_qa_tools/-/wikis/Tiny-QA-tasks#bug-of-the-day
[3] https://trends.debian.net/
[4] https://ftp-master.debian.org/deferred.html
[5] https://app.element.io/#/room/#debian-tiny-tasks:matrix.org
[6] https://wiki.debian.org/Glossary
Sneak Peek at Upcoming Conferences
==================================
I will join two conferences in March; feel free to talk to me if you spot
me there.
1. FOSSASIA Summit 2025 (March 13-15, Bangkok, Thailand)
Schedule: https://eventyay.com/e/4c0e0c27/schedule
2. Chemnitzer Linux-Tage (March 22-23, Chemnitz, Germany)
Schedule: https://chemnitzer.linux-tage.de/2025/de/programm/vortraege
Both events will have a Debian booth; come say hi!
Kind regards
Andreas.
--
Jecs Attila
Power Alarm Kft.
Jecs, Attila
2025-03-05 12:40:01 UTC
Permalink
unsubscribe
Hello everyone,
Dear Debian community,
this is bits from DPL for February.
Ftpmaster team is seeking new team members
==========================================
No, we are not.
Andreas asked us whether we would like a call for volunteers included in
Bits. Multiple team members explicitly told him that now would not
be a good time for that, for us.
For the FTP team,
--
Sean Whitton
--
Jecs Attila
Power Alarm Kft.
Jonathan Dowland
2025-03-05 16:00:01 UTC
Permalink
I genuinely love that there is engagement with Andreas's "Bits from the
DPL" mails, but, it would be lovely if people adjusted the Subject so we
can differentiate sub-topics from each other.
--
Please do not CC me for listmail.

👱🏻 Jonathan Dowland
✎ ***@debian.org
🔗 https://jmtd.net
Nilesh Patra
2025-03-05 18:30:01 UTC
Permalink
Dear Debian community,
this is bits from DPL for February.
Ftpmaster team is seeking new team members
==========================================
No, we are not.
Andreas asked us whether we would like a call for volunteers included in
Bits. Multiple team members explicitly told him that now would not
be a good time for that, for us.
Do you mind clarifying why that's the case, unless the reason is truly personal or undisclosable?
Otto Kekäläinen
2025-03-05 19:00:01 UTC
Permalink
Hi,
Post by Nilesh Patra
Dear Debian community,
this is bits from DPL for February.
Ftpmaster team is seeking new team members
==========================================
No, we are not.
Andreas asked us whether we would like a call for volunteers included in
Bits. Multiple team members explicitly told him that now would not
be a good time for that, for us.
Do you mind clarifying why that's the case, unless the reason is truly personal or undisclosable?
+1

According to https://ftp-master.debian.org/ there are currently no
'FTP Trainees'. I would assume you would like to have some at all times
to ensure the team has a healthy pipeline of new members being
trained?

Looking at the stats for NEW queue length (the graph image did not
survive archiving), it seems to have been the highest ever from November
2024 to January 2025, and the numbers didn't come down until the heroic
efforts of mainly one person (Thorsten).

Many of the aspiring Debian Developers I mentor have been stuck with
their work pending in the NEW queue for months. For example, an upload
of src:godot to change the source package name to src:godot3, with
almost no other changes, has been pending for almost two months, and the
new contributor Travis Wrightsman has been mostly idle with his Debian
work, just waiting for the package to pass in order to proceed with the
new Godot version.

With this experience I am surprised that one FTP team member is saying
that no help is needed. I wonder if that really is the opinion of
others in the team too?

- Otto
Matthias Urlichs
2025-03-06 08:10:02 UTC
Permalink
Post by Otto Kekäläinen
With this experience I am surprised that one FTP-team member is saying
that no help is needed?
Apparently the problem isn't that no help is needed but that nobody has
time to train the new help, citing possible burn-out trying to get
answers from the existing members and leaving in disappointment, if not
disgust. (My interpretation …)

While that's a valid concern, it's a problem every manager of an
overworked team in the world has faced, volunteer or not.

There are (of course) multiple ways to approach this issue. The point
(and I assume the reason Andreas basically ignored the team's rejection
of new members) is that "do nothing until somebody has time to train new
people" is among the worst possible approaches: experience tells us that
the most likely outcome is "another team member quits".

--
-- regards
--
-- Matthias Urlichs
Sean Whitton
2025-03-06 10:00:01 UTC
Permalink
Hello,
Apparently the problem isn't that no help is needed but that nobody has time
to train the new help, citing possible burn-out trying to get answers from the
existing members and leaving in disappointment, if not disgust. (My
interpretation …)
While that's a valid concern, it's a problem every manager of an overworked
team in the world has faced, volunteer or not.
There are (of course) multiple ways to approach this issue. The point (and I
assume the reason Andreas basically ignored the team's rejection of new
members) is that "do nothing until somebody has time to train new people" is
among the worst possible approaches: experience tells us that the most likely
outcome is "another team members quits".
You can't just throw people at a team of volunteers who are busy doing
other things and say "train them". Nobody wins there, and the
candidates won't come back at a time when those volunteers *do* have the
time to do the training.
--
Sean Whitton
Matthias Urlichs
2025-03-06 13:10:01 UTC
Permalink
Post by Sean Whitton
You can't just throw people at a team of volunteers who are busy doing
other things and say "train them".
That's true in general. However.

* this episode demonstrates that there are obviously a few crossed wires
between ftpmaster and the DPL; I think it's fair to assume that this is not
a recent development. Andreas' ignoring your NACK may not have been
particularly nice (he can apologize himself :-P ) but at least it threw
the problem into the spotlight.

* there seem to be some reasonable (IMHO) ideas out there to reduce
and/or spread the workload of NEW processing. These obviously need some
fleshed-out proposal, discussion, and people who then implement the
result. This requires volunteers, but not necessarily any up-front
training.

* I have learned (thanks @roehling) that the *actual* median time
packages spend in NEW is less than two days. In other words, *somebody*
must have *some* time available.

* Speaking from personal experience: Fighting an ongoing uphill battle
is much less rewarding than bulldozing some of that hill away. The
effect on actual time available for the task in question should be obvious.

My personal suggestion would be to work with one or two volunteers to
write a somewhat-comprehensive how-to-ftpmaster-the-NEW-queue manual, so
that the *next* time you have a bottleneck you can throw that document
at the volunteer and say "here's ten example packages, find their
problems if any, then come back".

Finally, a question -- as you don't seem to document the issues you have
with long-term packages in their ITP bug, where *do* you document them?

--
-- regards
--
-- Matthias Urlichs
Sean Whitton
2025-03-09 04:40:01 UTC
Permalink
Hello,
spend in NEW is less than two days. In other words, *somebody* must have
*some* time available.
It is almost entirely Thorsten.
My personal suggestion would be to work with one or two volunteers to write a
somewhat-comprehensive how-to-ftpmaster-the-NEW-queue manual, so that the
*next* time you have a bottleneck you can throw that document at the volunteer
and say "here's ten example packages, find their problems if any, then come
back".
The docs are public: https://salsa.debian.org/ftp-team/manpages
Finally, a question -- as you don't seem to document the issues you have with
long-term packages in their ITP bug, where *do* you document them?
There is a notes system in the 'dak process-new' command.
--
Sean Whitton
Simon Josefsson
2025-03-09 11:20:01 UTC
Permalink
Post by Sean Whitton
My personal suggestion would be to work with one or two volunteers to write a
somewhat-comprehensive how-to-ftpmaster-the-NEW-queue manual, so that the
*next* time you have a bottleneck you can throw that document at the volunteer
and say "here's ten example packages, find their problems if any, then come
back".
The docs are public: https://salsa.debian.org/ftp-team/manpages
Those are helpful even for me as someone uploading packages to NEW! I
wish I had read them before.

Is there any policy to forbid or accept uploading two packages A and B
at the same time to NEW where package A depends on package B?

I am guessing based on earlier e-mail interactions that this causes
problems for your review workflow, so I've stopped doing simultaneous
uploads and instead upload package B first and then wait for ACCEPT and
then upload package A etc.

That policy led to a slow migration path, since for many Go packages
the dependency chain of NEW packages can be quite deep. If I recall
correctly, I had this dependency situation earlier:

sigstore
-> sigstore-go
-> cosign
-> gittuf
-> gitsign

It would be nice to clarify in those documents if indeed there is a
policy on this. It seems surprising to me, since it would make it
impossible to get a set of NEW packages which have cyclic dependencies
into Debian.

Of course, a more gray situation like "we don't want to forbid it but it
causes us more work so please don't do it" is perfectly fine too.
Writing that down helps people do the right thing. The question has
come up two times for me when sponsoring new Go package uploads.

/Simon
Simon Josefsson
2025-03-09 11:40:02 UTC
Permalink
Post by Sean Whitton
Hello,
Post by Simon Josefsson
Post by Sean Whitton
The docs are public: https://salsa.debian.org/ftp-team/manpages
Those are helpful even for me as someone uploading packages to NEW! I
wish I had read them before.
Mmm. They sat private access only for ten years, but when I joined the
FTP team I worked to get them published.
Post by Simon Josefsson
Is there any policy to forbid or accept uploading two packages A and B
at the same time to NEW where package A depends on package B?
No such policy. 'dak inspect-upload' is meant to highlight unmet deps
in red so that we don't ACCEPT them in the wrong order, but we often
miss it.
IMO it is the maintainer's responsibility to ensure that NEW+unstable
together is always all installable, if you see what I mean.
Thank you for clarifying, and for getting those documents published.

What should I do if NEW+unstable becomes uninstallable during the NEW
review period?

Do you want maintainers to re-upload a newly built binary? I've never
done that, but doing so would make sense if you really want maintainers
to ensure that NEW+unstable is installable.

A binary Go package can have 500+ build dependencies transitively, and
the chance of all of those packages staying at the same version in
unstable during the NEW review period is pretty slim. I guess that you
already work around this, because I have only very rarely gotten REJECTs
for this reason (guessing max 3 times), and I know the situation
must have occurred for several packages that I did get ACCEPT on.

/Simon
Sean Whitton
2025-03-09 11:50:01 UTC
Permalink
Hello,
Post by Simon Josefsson
What should I do if NEW+unstable becomes uninstallable during the NEW
review period?
Do you want maintainers to re-upload a newly built binary? I've never
done that, but doing so would make sense if you really want maintainers
to ensure that NEW+unstable is installable.
A binary Go package can have 500+ build dependencies transitively, and
the chance of all of those packages staying at the same version in
unstable during the NEW review period is pretty slim. I guess that you
already work around this, because I have only very rarely gotten REJECTs
for this reason (guessing max 3 times), and I know the situation
must have occurred for several packages that I did get ACCEPT on.
Hmm, your answer makes me think that I didn't understand the question
correctly.

So, a more general answer: we mostly care about individual packages and
rely on maintainers and britney to care about how they interact.
But if we see something which obviously looks like it introduces breakage
we'll probably ask you about it, and if you tell us it's fine, we'll
probably go ahead.
--
Sean Whitton
Sean Whitton
2025-03-09 11:40:02 UTC
Permalink
Hello,
Post by Simon Josefsson
Post by Sean Whitton
The docs are public: https://salsa.debian.org/ftp-team/manpages
Those are helpful even for me as someone uploading packages to NEW! I
wish I had read them before.
Mmm. They sat private access only for ten years, but when I joined the
FTP team I worked to get them published.
Post by Simon Josefsson
Is there any policy to forbid or accept uploading two packages A and B
at the same time to NEW where package A depends on package B?
No such policy. 'dak inspect-upload' is meant to highlight unmet deps
in red so that we don't ACCEPT them in the wrong order, but we often
miss it.

IMO it is the maintainer's responsibility to ensure that NEW+unstable
together is always all installable, if you see what I mean.
--
Sean Whitton
G. Branden Robinson
2025-03-09 11:50:01 UTC
Permalink
Post by Sean Whitton
Post by Simon Josefsson
Post by Sean Whitton
The docs are public: https://salsa.debian.org/ftp-team/manpages
Those are helpful even for me as someone uploading packages to NEW! I
wish I had read them before.
Mmm. They sat private access only for ten years,
Why?
Post by Sean Whitton
but when I joined the FTP team I worked to get them published.
Thank you!

Regards,
Branden
Simon McVittie
2025-03-09 14:00:02 UTC
Permalink
Post by Sean Whitton
IMO it is the maintainer's responsibility to ensure that NEW+unstable
together is always all installable, if you see what I mean.
Do I assume correctly that this principle can be weakened for
experimental-NEW?

As a general principle I think uploads to NEW that are more complicated
than a completely new leaf package should usually be to experimental,
unless there is a reason why this specific package can't (for example if
foo_2.0 is already in experimental and now the maintainer needs a
package-split or a new SONAME for foo_1.2 in unstable). A lot of the
time the NEW package will need a new sourceful upload after it's been
accepted *anyway*, to get a source-only upload that can migrate to
testing - and if the package is in binary-NEW because it has a new
SONAME, it's better to have the maintainer and not the ftp team be in
control of the point at which it hits unstable and starts a transition.

Does the ftp team agree with that as a general idea? And if a largeish
dependency graph needs uploading together, is it OK to upload them all
to experimental-NEW, with the idea that if the ftp team accepts them in
the wrong order they'll just sit in BD-Uninstallable status until the
whole batch has been processed, with no real harm done?

smcv
Sean Whitton
2025-03-18 23:30:01 UTC
Permalink
Hello,
Post by Simon McVittie
Do I assume correctly that this principle can be weakened for
experimental-NEW?
As a general principle I think uploads to NEW that are more complicated than a
completely new leaf package should usually be to experimental, unless there is
a reason why this specific package can't (for example if foo_2.0 is already in
experimental and now the maintainer needs a package-split or a new SONAME for
foo_1.2 in unstable). A lot of the time the NEW package will need a new
sourceful upload after it's been accepted *anyway*, to get a source-only
upload that can migrate to testing - and if the package is in binary-NEW
because it has a new SONAME, it's better to have the maintainer and not the
ftp team be in control of the point at which it hits unstable and starts a
transition.
Does the ftp team agree with that as a general idea? And if a largeish
dependency graph needs uploading together, is it OK to upload them all to
experimental-NEW, with the idea that if the ftp team accepts them in the wrong
order they'll just sit in BD-Uninstallable status until the whole batch has
been processed, with no real harm done?
Yes, I think that is fine.
--
Sean Whitton
Simon Josefsson
2025-03-10 10:10:02 UTC
Permalink
The rationale given when I joined as an ftpassistant (c. 2012) for not
publicising decisions e.g. in the ITP was to avoid publishing
potentially harshly-worded and embarrassing reviews to maintainers in
public (like pointing out that you missed a fairly obvious license
declaration, incompatibility, or packaging step).
I would welcome feedback from the project as to whether this outweighs
the benefit of having past decisions available for public
consultation.
If that is really the only rationale, I think the reviews ought to be
public. As an offender of fairly obvious and embarrassing license
mistakes, and other NEW packaging problems, I believe the only
sustainable way to improve is to have more eyes looking at things and
contributing; doing things in public allows people to learn how the
process works, and to participate.

Charles Plessy's effort to have a pre-NEW review team to do such work
seems like a good start (although I never figured out how I would submit
a package to that effort).

I can see the need for doing private reviews with private feedback
though. Maybe what is needed is not so much to change ftp-master's
private review process but to have this public pre-review process to
smooth out the process a bit.

/Simon
Philip Hands
2025-03-10 10:50:01 UTC
Permalink
The rationale given when I joined as an ftpassistant (c. 2012) for not
publicising decisions e.g. in the ITP was to avoid publishing
potentially harshly-worded and embarrassing reviews to maintainers in
public (like pointing out that you missed a fairly obvious license
declaration, incompatibility, or packaging step).
I would welcome feedback from the project as to whether this outweighs
the benefit of having past decisions available for public consultation.
If the price for the ability to learn from the mistakes of others is an
occasional dose of public humiliation, then that's a price I'm happy to
pay (and I speak as someone who has a talent for making trivial errors).

Also, we claim in the Debian Social Contract that we don't hide problems.

How about if the (possibly harsh) reasoning were published in a form
that only directly tied it to the package name, such that search engines
would not instantly and permanently place that comment on one's CV?

I'd imagine that the stigma of a rejection would pretty quickly give way to
an understanding that everyone makes mistakes occasionally, which may be
a good way of avoiding new contributors becoming intimidated by the
assumption that everyone else is doing a perfect job.

Cheers, Phil.
--
Philip Hands -- https://hands.com/~phil
Jonathan Dowland
2025-03-10 12:00:01 UTC
Permalink
I've recently been trying to help rescue a package that was dropped for
Trixie, partly for technical reasons (a source package split means a
round trip through NEW) and partly for license reasons (some uncertainty
about the copyright of some icons, which have been in the archive for
decades, but since a NEW round-trip is required, this is a reject-worthy
bug now).
We discard the source tarballs and changes files on REJECT so there is
nothing to `debdiff`. This partially happens for legal reasons: if we
determine a package is not suitable for the archive then we may no
longer have the legal right to retain it on ftp-master.
That makes sense. In my case, I still don't have access to the source
package that was rejected, but that could be solved if the (very busy)
maintainer uploaded it somewhere else (e.g. to Salsa). Since it's never
been in Debian (technically), there are no historic packages to look at
(yet).
The rationale given when I joined as an ftpassistant (c. 2012) for not
publicising decisions e.g. in the ITP was to avoid publishing
potentially harshly-worded and embarrassing reviews to maintainers in
public (like pointing out that you missed a fairly obvious license
declaration, incompatibility, or packaging step).
I would welcome feedback from the project as to whether this outweighs
the benefit of having past decisions available for public consultation.
I had to ask nicely for someone with privileges to send me the ftp team
reject notes to get some clue as to what needs fixing. So I would
definitely prefer it if they were open by default.


Thanks for your efforts!
--
Please do not CC me for listmail.

👱🏻 Jonathan Dowland
✎ ***@debian.org
🔗 https://jmtd.net
Otto Kekäläinen
2025-03-06 19:20:01 UTC
Permalink
Hi,
Post by Sean Whitton
Apparently the problem isn't that no help is needed but that nobody has time
to train the new help, citing possible burn-out trying to get answers from the
existing members and leaving in disappointment, if not disgust. (My
interpretation …)
While that's a valid concern, it's a problem every manager of an overworked
team in the world has faced, volunteer or not.
There are (of course) multiple ways to approach this issue. The point (and I
assume the reason Andreas basically ignored the team's rejection of new
members) is that "do nothing until somebody has time to train new people" is
among the worst possible approaches: experience tells us that the most likely
outcome is "another team members quits".
You can't just throw people at a team of volunteers who are busy doing
other things and say "train them". Nobody wins, there, and the
candidates won't come back at a time when those volunteers *do* have the
time to do the training.
I don't think you are quoting the DPL above correctly. I think he had
good judgement, and raising awareness of the FTP Masters team being spread
thin and needing more help in a Bits from DPL announcement is the
correct thing to do.

New people standing up and stating they want to help is a good thing,
even with the risk that some of those people would go away while
waiting.

I did also read the queue processing time reports by Matthias and
Timo. On a quick look I wasn't able to find stats on which FTP Master
team member has done how many reviews, but in my experience it seems
to rely on the heroic efforts of a very few people (thanks Thorsten
for all your work!), and having more people in the team would be of
great benefit for Debian, and rightly something the DPL should help
with.
Sean Whitton
2025-03-06 01:10:01 UTC
Permalink
Hello,
Post by Nilesh Patra
Do you mind clarifying why that's the case, unless the reason is truly
personal or undisclosable?
It's pretty simple -- there is no-one with the free time to train them
right now, so trainees would simply burn out, because they
wouldn't get enough feedback on their NEW reviews.

We try to recruit only when there is someone who is able to dedicate
some time to training. That depends on what the other team members are
busy with, in and outside of Debian, at a given time.

This was made very clear to Andreas.
--
Sean Whitton
Charles Plessy
2025-03-06 04:30:01 UTC
Permalink
Hi Sean and everybody,

Around 12 years ago, I proposed a peer-review system to increase the quality of
the packages in the NEW queue. https://wiki.debian.org/CopyrightReview

Maybe we could revisit the idea along these lines:

- a Salsa group into which people fork repos and run CI screens for copyright,
license and missing source issues.

- a peer-review system based on issues or MRs (for instance to a master
repository with a text file tracing the outcome of the reviews).

- as of today people would use it to ensure their submissions to NEW are to
the highest standards and therefore the least likely to waste the time of the
FTP team members.

- the outcome of the NEW processing of the peer-reviewed packages is also recorded
by volunteers, allowing us to better measure the achievements and usefulness of
the system.

- The FTP team, if they wish, can provide feedback.

- when the FTP team calls for new trainees, applicants who have a track record
of peer reviews in that system can show it to the FTP team, who are free to
do what they want with this information.

- If the FTP team recruits somebody who was a peer reviewer and liked that
system, a positive feedback loop is created.

Have a nice day,

Charles
--
Charles Plessy Nagahama, Yomitan, Okinawa, Japan
Debian Med packaging team http://www.debian.org/devel/debian-med
Tooting from home https://framapiaf.org/@charles_plessy
- You do not have my permission to use this email to train an AI -
Sean Whitton
2025-03-06 09:50:01 UTC
Permalink
Hello,
Post by Charles Plessy
Hi Sean and everybody,
Around 12 years ago, I proposed a peer-review system to increase the quality of
the packages in the NEW queue. https://wiki.debian.org/CopyrightReview
If someone wants to set this up in a way that doesn't increase ftp team
workload but means packages have to be reject'd less often -- by all
means.
--
Sean Whitton
Julien Plissonneau Duquène
2025-03-06 10:00:01 UTC
Permalink
Hi,
Post by Sean Whitton
If someone wants to set this up in a way that doesn't increase ftp team
workload but means packages have to be reject'd less often -- by all
means.
Do you have some stats or even just an estimate of how often this
happens, or is there an archive or a log somewhere that could be used to
estimate the rejection frequency and most common causes?

Cheers,
--
Julien Plissonneau Duquène
Charles Plessy
2025-03-07 01:00:02 UTC
Permalink
Post by Sean Whitton
Post by Charles Plessy
Around 12 years ago, I proposed a peer-review system to increase the quality of
the packages in the NEW queue. https://wiki.debian.org/CopyrightReview
If someone wants to set this up in a way that doesn't increase ftp team
workload but means packages have to be reject'd less often -- by all
means.
Thanks Sean, Thorsten and everybody else for the positive feedback.

I have prepared a stub for a "Gateway to NEW" on Salsa:

https://salsa.debian.org/newgateway-team

I added `Debian` as a team member.

I am under the impression that forking repositories will not be necessary: if
we provide CI pipeline packages like the salsa-ci project, and smart ways to
turn them on and off, then people can run their tests in their own
repositories. I have some new r-cran-* packages to prepare next week; I will
use them as a proof of principle. Everybody is welcome to be faster than me to
test the idea.

I am not entirely sure where to continue the discussion, but maybe we can
try to leverage Salsa as much as possible for that as well?

Have a nice day,

Charles
--
Charles Plessy Nagahama, Yomitan, Okinawa, Japan
Debian Med packaging team http://www.debian.org/devel/debian-med
Tooting from home https://framapiaf.org/@charles_plessy
- You do not have my permission to use this email to train an AI -
Simon Josefsson
2025-03-07 17:30:01 UTC
Permalink
Post by Charles Plessy
https://salsa.debian.org/newgateway-team
I added `Debian` as a team member.
I am under the impression that forking repositories will not be necessary: if
we provide CI pipeline packages like the salsa-ci project, and smart ways to
turn them on and off, then people can run their tests in their own
repositories. I have some new r-cran-* packages to prepare next week; I will
use them as a proof of principle. Everybody is welcome to be faster than me to
test the idea.
I am not entirely sure on where to continue the discussion, but maybe we can
try to leverage Salsa as much as possible for that as well?
Thanks for starting this -- could you re-enable Issues for the Pipelines
project?

I suggest using 'lrc' in the pipeline. I already do this for many
packages, and I just add

- https://salsa.debian.org/debian/licenserecon/raw/main/debian/licenserecon.yml

to the debian/salsa-ci.yml file; see the entire example file below. It will
cause a CI failure on any debian/copyright mistakes. Yes, false
positives happen, and it doesn't always handle Autotools projects with
a lot of generated files with complex licenses well.

/Simon

include:
  - https://salsa.debian.org/salsa-ci-team/pipeline/raw/master/recipes/debian.yml
  - https://salsa.debian.org/debian/licenserecon/raw/main/debian/licenserecon.yml

variables:
  SALSA_CI_ENABLE_WRAP_AND_SORT: 'true'
  SALSA_CI_WRAP_AND_SORT_ARGS: '-asbkt'
  SALSA_CI_DISABLE_APTLY: 0
  SALSA_CI_LINTIAN_FAIL_WARNING: '1'
  SALSA_CI_AUTOPKGTEST_ALLOWED_EXIT_STATUS: '0'
Charles Plessy
2025-03-08 01:10:01 UTC
Permalink
Post by Simon Josefsson
Thanks for starting this -- could you re-enable Issues for the Pipelines
project?
Hi Simon,

I have enabled the issues in all repositories. It seems that Salsa's
policy is to have them disabled by default.
Post by Simon Josefsson
I suggest to use 'lrc' in the pipeline. I already do this for many
packages, and I just add
- https://salsa.debian.org/debian/licenserecon/raw/main/debian/licenserecon.yml
Looks good!
Post by Simon Josefsson
Yes, false positives happens, and it doesn't always handle Autotools
projects with a lot of generated files with complex licenses well.
Here we are in the context of entirely new packages, so we can explore
the idea that packages need either to be licenserecon-clean, or to
include a note explaining why they can't be, in order to get a review. For
instance, the form to request a review (issue, MR, or service counter; I
am not sure yet) could contain a checklist item about this.

By the way, Simon and everybody else, please feel free to ask for
elevated Salsa privileges as soon as you need them, as long as the list
of admins does not already look too long to you.

Have a nice day,

Charles
--
Charles Plessy Nagahama, Yomitan, Okinawa, Japan
Debian Med packaging team http://www.debian.org/devel/debian-med
Tooting from work, https://fediscience.org/@charles_plessy
Tooting from home, https://framapiaf.org/@charles_plessy
Simon Josefsson
2025-03-08 08:50:01 UTC
Permalink
Post by Charles Plessy
Post by Simon Josefsson
I suggest to use 'lrc' in the pipeline. I already do this for many
packages, and I just add
- https://salsa.debian.org/debian/licenserecon/raw/main/debian/licenserecon.yml
Looks good!
Post by Simon Josefsson
Yes, false positives happens, and it doesn't always handle Autotools
projects with a lot of generated files with complex licenses well.
Here we are in the context of entirely new packages, so we can explore
the idea that packages need either to be licenserecon-clean, or to
include a note why they can't, in order to get a review. For instance,
the form to request a review (issue, MR, or service counter, I am not
sure yet), could contain a checklist item about this.
You can add exceptions, similar to lintian overrides, for known false
positives:

https://salsa.debian.org/debian/gssproxy/-/blob/master/debian/lrc.excludes?ref_type=heads
https://salsa.debian.org/go-team/packages/golang-github-sigstore-protobuf-specs/-/blob/debian/sid/debian/lrc.config?ref_type=heads

I use it for a bunch of packages, although I have to admit that on
complex false positives I tend to disable it rather than trying to
figure out how to write the exception file and/or file bug reports (bugs
which more often tend to be in licensecheck rather than
licenserecon).

It would be nice to add this to the standard Salsa CI pipeline:

https://salsa.debian.org/salsa-ci-team/pipeline/-/issues/395

Having an 'include' statement in debian/salsa-ci.yml is
not that different from adding some 'variables:' to enable an lrc job, so
it is not critical to add it to the standard pipeline. Maybe if more
people start to use it we will gain more confidence in it as a useful tool,
and later on add it to the standard pipeline.

/Simon
Sean Whitton
2025-03-09 05:30:01 UTC
Permalink
Hello Charles,

Thanks. Please put prominent links to these three places:

- Policy 2.3 -- this covers 90% of my NEW rejects

Based on my experience processing NEW, a lot of DDs don't seem to
really have an understanding of the requirements explained here.
Including me, before I joined the ftp team.
I updated this section to try to capture what I learned.

- Policy 12.5 -- covers some of the other REJECTs

- the REJECT-FAQ.
--
Sean Whitton
Philip Hands
2025-03-10 10:40:01 UTC
Permalink
Post by Sean Whitton
Hello Charles,
- Policy 2.3 -- this covers 90% of my NEW rejects
Based on my experience processing NEW, a lot of DDs don't seem to
really have an understanding of the requirements explained here.
Including me, before I joined the ftp team.
I updated this section to try to capture what I learned.
- Policy 12.5 -- covers some of the other REJECTs
- the REJECT-FAQ.
Might it be worth linking to those policy sections, here perhaps:

https://ftp-master.debian.org/#rejections

and then linking to that from the NEW summary page.

People looking at the NEW summary page are quite likely to be wondering
if they may have done something wrong, so will be motivated to follow
the links, and might even manage to notice a reason for a rejection by
themselves, and fix it unprompted.

Cheers, Phil.
--
Philip Hands -- https://hands.com/~phil
Maytham Alsudany
2025-03-09 10:10:04 UTC
Permalink
Post by Charles Plessy
https://salsa.debian.org/newgateway-team
I've got a couple of questions:

Am I correct in assuming that each package to be reviewed will be an
issue under the "reviews" repo (and possibly also mention the relevant
package maintainer there)?

Will packages be reviewed upon request, or will it be fine to pick out
packages from the NEW queue and review them to assist FTP masters?

Thanks for working on this.
--
Maytham
Charles Plessy
2025-03-11 09:00:01 UTC
Permalink
Post by Maytham Alsudany
Post by Charles Plessy
https://salsa.debian.org/newgateway-team
Am I correct in assuming that each package to be reviewed will be an
issue under the "reviews" repo (and possibly also mention the relevant
package maintainer there)?
Will packages be reviewed upon request, or will it be fine to pick out
packages from the NEW queue and review them to assist FTP masters?
The way I envision it is that people will ask for peer review before uploading
to the NEW queue. One of the reasons is that we will provide CI pipelines and
checklists that assist in the preparation of the `debian/copyright` file, and once
the maintainer has used that system, requesting a review will be only a few
clicks away. At the moment I have the impression that using issues is the
easiest way, but alternatively Merge Requests on the file that tracks packages
and their reviews could work too.

I have made a very simple proof of concept here:

https://salsa.debian.org/newgateway-team/reviews/-/blob/main/README.md?ref_type=heads

I have not finished collecting the contents of the checklists, and I am just at
the beginning of learning how CI pipelines work on GitLab. People interested in
making the NEW gateway happen, your contribution is most welcome! It cannot
be the pet project of a single person. In any case, I will use it for my packages
and will do my best to convince my teammates to do so too :)

Have a nice day,

Charles
--
Charles Plessy Nagahama, Yomitan, Okinawa, Japan
Debian Med packaging team http://www.debian.org/devel/debian-med
Tooting from home https://framapiaf.org/@charles_plessy
- You do not have my permission to use this email to train an AI -
Simon Josefsson
2025-03-11 15:40:02 UTC
Permalink
Post by Charles Plessy
Post by Maytham Alsudany
Post by Charles Plessy
https://salsa.debian.org/newgateway-team
Am I correct in assuming that each package to be reviewed will be an
issue under the "reviews" repo (and possibly also mention the relevant
package maintainer there)?
Will packages be reviewed upon request, or will it be fine to pick out
packages from the NEW queue and review them to assist FTP masters?
The way I envision it, is that people will ask for peer review before uploading
to the NEW queue.
Could you explain how I would ask for review of a package? I re-read
this thread, and the newgateway-team homepages, but I still don't
understand how you think the process should work.

Could we test the process by reviewing 'litetlog'?

https://salsa.debian.org/go-team/packages/litetlog/

It is already in NEW queue, but maybe more eyes on it will catch
mistakes before ftp-master review. It will help me understand what you
think the process should be here.

/Simon
Charles Plessy
2025-03-12 01:00:01 UTC
Permalink
Post by Simon Josefsson
Could you explain how I would ask for review of a package? I re-read
this thread, and the newgateway-team homepages, but I still don't
understand how you think the process should work.
Could we test the process by reviewing 'litetlog'?
Hi Simon,

I have just drafted a workflow in

https://salsa.debian.org/newgateway-team/reviews#how-to-request-or-make-a-review

which I quote here:

0. (We are in the pilot phase. Improvements to this workflow are welcome.)

1. The package maintainer adds the
[pipelines](https://salsa.debian.org/newgateway-team/pipelines) to their Salsa CI
file. (It would be cool to have a _devscripts_ script for that.)

2. The package maintainer opens an issue with the _Review_ template (shall we
just make it the default?). Salsa ID pings in the issue can be useful for
exchanging reviews.

3. Once the checklist is clear, the maintainer uses the create-merge-request
button on the issue page to add the package to the table below. (Or is just
editing the file directly fine?)

4. Somebody merges the request after verifying quickly that the checklist was
properly addressed.

5. Reviewers open issues with the _Review_ template. If problems are found,
they ping the maintainer with their salsa ID in the issue discussion. Reviews
end by adding the issue ID to the table below via an MR to be accepted and
merged by the package maintainer.

6. Once all reviewers' thumbs are up, update the table below (with or without
MR), and upload to NEW.

7. Once the package leaves the NEW queue, record the outcome in the table below.


By all means, please join the tests with litetlog!

Charles
--
Charles Plessy Nagahama, Yomitan, Okinawa, Japan
Debian Med packaging team http://www.debian.org/devel/debian-med
Tooting from home https://framapiaf.org/@charles_plessy
- You do not have my permission to use this email to train an AI -
Maytham Alsudany
2025-03-12 03:10:02 UTC
Permalink
Post by Charles Plessy
I have just drafted a workflow in
https://salsa.debian.org/newgateway-team/reviews#how-to-request-or-make-a-review
0. (We are in pilot phases. Improvements of this workflow are welcome)
1. The package maintainer adds the
[pipelines](https://salsa.debian.org/newgateway-team/pipelines) to its Salsa CI
file. (It would be cool to have a _devscripts_ script for that.)
I like the idea of pipelines that only create artifacts, without
creating a pass/fail result that affects the package's overall CI.
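To make that concrete, here is a rough, purely illustrative sketch of such
an artifacts-only job (the job name is made up, and the lrc invocation is
only a guess at how one might capture its report):

copyright-report:
  stage: test
  image: debian:sid
  script:
    # Install licenserecon and keep its findings as a report file; never
    # fail the job on findings, only collect them for reviewers.
    - apt-get update && apt-get install -y licenserecon
    - lrc | tee lrc-report.txt || true
  allow_failure: true
  artifacts:
    paths:
      - lrc-report.txt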
Post by Charles Plessy
2. The package maintainer opens an issue with _Review_ template (shall we
just make it default?). Salsa ID pings in the issue can be useful for
exchanging reviews.
3. Once the checklist is clear, the maintainer uses the create merge request
button in the issue page, to add the package in the table below. (Or just
editing the file directly is fine?)
Wouldn't using an issue's open/closed status and tags be sufficient
rather than MRs for each issue? For instance:
- open = not in the archive yet
- closed = in the archive / abandoned
- tag "passed-review"
- tag "accepted" when the package enters the archive
I also suggest using a standard naming scheme like "package/version"
(e.g. "r-cran-multitaper/1.0-17-1") for easy lookup of packages as well
as different versions (for binNEW).

Then, if ftp-masters want to check for a peer-review, they can just look
for the name of the package in the list of open issues, named with a
standard "package-name/package-version". 

And if the table in README is still necessary, then (I think) a CI
pipeline can update it from the issue data.
Post by Charles Plessy
4. Somebody merges the request after verifying quickly that the checklist was
properly addressed.
5. Reviewers open issues with the _Review_ template. If problems are found,
they ping the maintainer with their salsa ID in the issue discussion. Reviews
end by adding the issue ID in the table below via a MR to be accepted and
merged by the package maintainer.
6. Once all reviewers thumbs are up, update the table below (with or without
MR), and upload to NEW.
7. Once the package leaves the NEW queue, record the outcome in the table below.
Some definition of scope would be useful (e.g. "we don't check if the
program runs or if the package builds (that's the responsibility of the
uploader), we just do the same checks as those that happen in the NEW
queue").

Let me know how I can help! I enjoy reviewing copyright files, so if any
arise, please send them my way :)
--
Maytham
Simon Josefsson
2025-03-12 08:50:01 UTC
Permalink
Post by Maytham Alsudany
Post by Charles Plessy
2. The package maintainer opens an issue with _Review_ template (shall we
just make it default?). Salsa ID pings in the issue can be useful for
exchanging reviews.
3. Once the checklist is clear, the maintainer uses the create merge request
button in the issue page, to add the package in the table below. (Or just
editing the file directly is fine?)
Wouldn't using an issue's open/closed status and tags be sufficient
Yeah, I also find multiple issues per package confusing. With one issue
per package, it is easy to add comments to the issue explaining what has
to change, and to discuss those changes. The top-level summary can also
be updated to summarize current status.
Post by Maytham Alsudany
- open = not in the archive yet
- closed = in the archive / abandoned
- tag "passed-review"
- tag "accepted" when the package enters the archive
Right, tags can be used to even further summarize things.
Post by Maytham Alsudany
I also suggest using a standard naming scheme like "package/version"
(e.g. "r-cran-multitaper/1.0-17-1") for easy lookup of packages as well
as different versions (for binNEW).
+1
Post by Maytham Alsudany
Then, if ftp-masters want to check for a peer-review, they can just look
for the name of the package in the list of open issues, named with a
standard "package-name/package-version". 
+1
Post by Maytham Alsudany
And if the table in README is still necessary, then (I think) a CI
pipeline can update it from the issue data.
Doable, but I'm not sure that complexity adds anything.
Post by Maytham Alsudany
Post by Charles Plessy
4. Somebody merges the request after verifying quickly that the checklist was
properly addressed.
5. Reviewers open issues with the _Review_ template. If problems are found,
they ping the maintainer with their salsa ID in the issue discussion. Reviews
end by adding the issue ID in the table below via a MR to be accepted and
merged by the package maintainer.
6. Once all reviewers thumbs are up, update the table below (with or without
MR), and upload to NEW.
7. Once the package leaves the NEW queue, record the outcome in the table below.
Some definition of scope would be useful (e.g. "we don't check if the
program runs or if the package builds (that's the responsibility of the
uploader), we just do the same checks as those that happen in the NEW
queue").
+1 -- that is the responsibility of the main Salsa pipeline, I think.
Post by Maytham Alsudany
Let me know how I can help! I enjoy reviewing copyright files, so if any
arise, please send them my way :)
I hope that we can find some problem to fix in the `litetlog` packaging
when I submit it, arguing for the awesomeness of this effort :)

/Simon
Charles Plessy
2025-03-13 09:10:01 UTC
Permalink
Hi all,

Following Maytham and Simon's feedback, I now propose a workflow that is purely
based on issues. The default template is a checklist to guide the reviews.

<https://salsa.debian.org/newgateway-team/reviews/-/blob/main/.gitlab/issue_templates/Default.md?ref_type=heads>
(please bear in mind that it is work-in-progress)

I think that it is best to have one issue per review, and therefore per
reviewer, as it helps transparency and accountability. If there were one
log with three persons' discussions and rebuttals mixed together, I think
it would be harder to follow.

This said, I think that in most cases nobody should have to read the
reviews again: we should aim for perfectly clean packages. In the worst-case
scenario, notes to the FTP team should go through the usual channels
(README.source, etc.). We should not ask them to spend time on our experiment.

The package maintainer essentially does the same review as the reviewers, hence
a single default template should be enough. To ease everybody's work, I think
that we should ask that the standard Salsa CI pipeline pass unless there
is a good reason for it to fail. Lintian has a bunch of essential checks for
copyright files...

This is not set in stone, and the CI tests are still sparse and fragile (GitLab
CI advice or MRs welcome), but I invite everybody to have a look and try it.
Maytham, Simon, I pinged you with your @id on Salsa; please let me know if you
did not get the message: it is essential that it is easy for everyone to get that
right.

Have a nice day,

Charles
--
Charles Plessy Nagahama, Yomitan, Okinawa, Japan
Debian Med packaging team http://www.debian.org/devel/debian-med
Tooting from home https://framapiaf.org/@charles_plessy
- You do not have my permission to use this email to train an AI -
Simon Josefsson
2025-03-06 12:30:01 UTC
Permalink
Post by Charles Plessy
Hi Sean and everybody,
Around 12 years ago, I proposed a peer-review system to increase the quality of
the packages in the NEW queue. https://wiki.debian.org/CopyrightReview
I like this idea, as an opt-in service to prepare for ftp-master review.

I've done at least 25 NEW uploads over the past months, and
debian/copyright is certainly the biggest manual time consumer for me.

I've learned some tricks to get it right, but I also rely on ftp-masters
to catch me when I miss something.

There are corner-cases where I would like to have a discussion about
some minor aspect, and I've been trying (although not always succeeding)
to not pester ftp-masters with these minor questions.

Having an opt-in service of people who want to perform ftp-master-like
debian/copyright review of a package would be helpful for me. I don't
find that posting to debian-legal serves this function, as the advice
received there is often neither helpful nor actionable. Using debian-legal for
this discussion would be fine; maybe that makes it more contributory.

It would also be good to write down some of the finer rules on some
aspects of debian/copyright, such as how to deal with public domain
contributions, vendored stuff where there is a known copyright holder
who is not mentioned in any file, how to deal with non-free DCO-like
statements, etc.

/Simon
Post by Charles Plessy
- a Salsa group into which people fork repos and run CI screens for copyright,
license and missing source issues.
- a peer-review system based on issues or MRs (for instance to a master
repository with a text file tracing the outcome of the reviews).
- as of today people would use it to ensure their submissions to NEW are to
the highest standards and therefore the least likely to waste time of the
FTP team members.
- the outcome of the NEW processing of the peer review is also recorded by
volunteers, allowing to better measure the achievements and usefulness of
the system.
- The FTP team, if they wish, can provide feedback.
- when the FTP team calls for new trainees, applicants who have a track record
of peer reviews in that system can show it to the FTP team, who are free to
do what they want with this information.
- If the FTP team recruits someobody who was peer reviewer and liked that
system, a positive loop is made.
Have a nice day,
Charles
Otto Kekäläinen
2025-03-06 19:10:02 UTC
Permalink
Hi!
Post by Charles Plessy
Around 12 years ago, I proposed a peer-review system to increase the quality of
the packages in the NEW queue. https://wiki.debian.org/CopyrightReview
For packages that I sponsor, I already do reviews of the
debian/copyright and all other files. These are recorded as Merge
Requests in Salsa. Perhaps the easiest way to achieve the workflow you
envision would be to have a field in the upload metadata that links to
the Merge Request on Salsa, so FTP masters can see who reviewed the
contents and if their feedback was properly addressed in addition to
reviewing the uploaded artifacts from scratch?
Pirate Praveen
2025-03-07 08:00:01 UTC
Permalink
Post by Otto Kekäläinen
Hi!
Post by Charles Plessy
Around 12 years ago, I proposed a peer-review system to increase the quality of
the packages in the NEW queue. https://wiki.debian.org/CopyrightReview
For packages that I sponsor, I already do reviews of the
debian/copyright and all other files. These are recorded as Merge
Requests in Salsa. Perhaps the easiest way to achieve the workflow you
envision would be to have a field in the upload metadata that links to
the Merge Request on Salsa, so FTP masters can see who reviewed the
contents and if their feedback was properly addressed in addition to
reviewing the uploaded artifacts from scratch?
Maybe this can go in the changelog? Adding a new field may require tools and
people to adjust, and that can take a long time.
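For illustration only (package name, version, and URL below are invented),
such a pointer in debian/changelog might look like:

hello (2.10-3) unstable; urgency=medium

  * New upstream release.
  * debian/copyright was peer-reviewed before upload; see
    https://salsa.debian.org/debian/hello/-/merge_requests/1 (example URL).

 -- Jane Doe <jane@example.org>  Fri, 07 Mar 2025 09:00:00 +0100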
Simon McVittie
2025-03-07 10:20:01 UTC
Permalink
Post by Pirate Praveen
Post by Otto Kekäläinen
Post by Charles Plessy
Around 12 years ago, I proposed a peer-review system to increase the quality of
the packages in the NEW queue. https://wiki.debian.org/CopyrightReview
For packages that I sponsor, I already do reviews of the
debian/copyright and all other files. These are recorded as Merge
Requests in Salsa. Perhaps the easiest way to achieve the workflow you
envision would be to have a field in the upload metadata that links to
the Merge Request on Salsa, so FTP masters can see who reviewed the
contents and if their feedback was properly addressed in addition to
reviewing the uploaded artifacts from scratch?
May be this can go to changelog? As adding new filed may need tools and
people to adjust and can take a long time.
If the public NEW-queue viewer at https://ftp-master.debian.org/new.html
is an accurate reflection of the files that the ftp team would look at
first in their internal processes, then the top changelog entry (but only
the top changelog entry, and not later ones), debian/README.source, or
the copyright file itself would be the places to put evidence supporting
the copyright file being correct.

A change history of problems that were reported and fixed doesn't seem
like something that would speed up the ftp team's work: if they feel that
they have to review a change history *in addition* to reviewing the uploaded
artifacts, I don't see how that would take a shorter time than only
reviewing the uploaded artifacts. The only way this could help is if the
ftp team were willing to trust the information from peer review and do
a less in-depth review of packages that have had a positive peer review,
but I have not seen any indication from the ftp team that they would be
prepared to do that.

So I think it could be better to frame this in terms of finding a good
place to put supporting evidence ("I know the licensing situation
in contrib/foo/ looks strange at first glance, but in fact it's OK
because..."), rather than somewhere to put a change history of previous
negative feedback being addressed. The ftp team don't need to know about
the existence of previous, wrong packages; they are only approving or
rejecting the hopefully-correct final package that has been submitted
for their review.
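
To illustrate (with a made-up package and a made-up rationale), such
supporting evidence could live in the Comment field of the relevant Files
paragraph in debian/copyright:

    Files: contrib/foo/*
    Copyright: 2019-2024 Example Upstream Authors
    License: Expat
    Comment: The file headers in contrib/foo/ still mention an older licence,
     but upstream relicensed these files to Expat; see the upstream changelog
     for their 2.0 release.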

smcv
Otto Kekäläinen
2025-03-07 15:00:01 UTC
Permalink
How about adding a new header field in debian/copyright
(https://dep-team.pages.debian.net/deps/dep5/) called something like
"Reviews" which would be a list of URLs pointing to whatever public
system was used to record a review?

Then whoever reviews the debian/copyright file has easy access to the
reviews the package maintainer recorded there.
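
A minimal sketch of what that could look like in the header paragraph of
debian/copyright (note that "Reviews" is not part of the current
copyright-format 1.0 specification, and the Salsa URLs below are placeholders):

    Format: https://www.debian.org/doc/packaging-manuals/copyright-format/1.0/
    Upstream-Name: foo
    Reviews:
     https://salsa.debian.org/debian/foo/-/merge_requests/12
     https://salsa.debian.org/debian/foo/-/merge_requests/17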
Sean Whitton
2025-03-09 05:20:01 UTC
Permalink
Hello,
Post by Simon McVittie
If the public NEW-queue viewer at https://ftp-master.debian.org/new.html
is an accurate reflection of the files that the ftp team would look at
first in their internal processes, then the top changelog entry (but only
the top changelog entry, and not later ones), debian/README.source, or
the copyright file itself would be the places to put evidence supporting
the copyright file being correct.
Just to note that this order is not the order in which they get
presented to us. Instead, there's some logic in 'dak process-new' to
try to sort them helpfully.

It's got some issues, like if a package has a note attached to it then
it gets sorted last, the idea being that the person who left the note
hopefully will get prompted to look at it again in response to an e-mail
from the uploader. That causes things to get stuck at the bottom for
ages, though.

There are command-line options to change the order a bit; I think
basically we can choose whether or not binNEW packages get sorted first.
I have that turned off in my shell aliases; I don't know about other
team members.
Post by Simon McVittie
A change history of problems that were reported and fixed doesn't seem
like something that would speed up the ftp team's work: if they feel that
they have to review a change history *in addition* to reviewing the uploaded
artifacts, I don't see how that would take a shorter time than only
reviewing the uploaded artifacts. The only way this could help is if the
ftp team were willing to trust the information from peer review and do
a less in-depth review of packages that have had a positive peer review,
but I have not seen any indication from the ftp team that they would be
prepared to do that.
Yes.
Post by Simon McVittie
So I think it could be better to frame this in terms of finding a good
place to put supporting evidence ("I know the licensing situation
in contrib/foo/ looks strange at first glance, but in fact it's OK
because..."), rather than somewhere to put a change history of previous
negative feedback being addressed. The ftp team don't need to know about
the existence of previous, wrong packages, they are only approving or
rejecting the hopefully-correct final package that has been submitted
for their review.
Comments in d/copyright or d/changelog help.
--
Sean Whitton
Marc Haber
2025-03-06 08:10:03 UTC
Permalink
Post by Sean Whitton
Post by Nilesh Patra
Do you mind clarifying why that's the case, unless the reason is truly
personal or undisclosable?
It's pretty simple -- there is no-one with the free time to train them
right now, in which case trainees will simply burn out, because they
won't get enough feedback on their NEW reviews.
This sounds like a self-amplifying situation. The project should do
something about this, especially for such a core role.

I thank the DPL for putting this to public attention.

Greetings
Marc. I'll take my Popcorn with salt please.
--
-----------------------------------------------------------------------------
Marc Haber | "I don't trust Computers. They | Mailadresse im Header
Leimen, Germany | lose things." Winona Ryder | Fon: *49 6224 1600402
Nordisch by Nature | How to make an American Quilt | Fax: *49 6224 1600421
Holger Levsen
2025-03-06 08:40:02 UTC
Permalink
Post by Marc Haber
Marc. I'll take my Popcorn with salt please.
yeah, it's pretty funny to see a team burn out and have the same silly
& salty discussion about this again and again.

or maybe not.

also talking about how NEW is a bottleneck will be really motivating
to the person who has been doing most of the NEW processing in the last
months.

or maybe not.

I'm not sure about you, but just last week I got a package through NEW
in a few hours. my rust upload folder also told me we (mostly sequoia,
but also rebuilderd) got 86 packages through NEW in the last 15 months,
which means more than one package per week. and that's just our little
corner here.

I do have some complaints about the ftp team, as I have about many things,
*but* I also have tremendous respect for their work. and I know, you might
have forgotten but it was discussed on d-d-a iirc, that several improvements
are being worked on right now, and that's not only tag2upload.

so if the/a team says they cannot handle new members right now and thus
there should be no big announcement asking for new members, I very much
think this should be respected and not be ignored and spread on our
most visible mailing list, where their pain will be consumed with popcorn.
--
cheers,
Holger

⢀⣎⠟⠻⢶⣊⠀
⣟⠁⢠⠒⠀⣿⡁ holger@(debian|reproducible-builds|layer-acht).org
⢿⡄⠘⠷⠚⠋⠀ OpenPGP: B8BF54137B09D35CF026FE9D 091AB856069AAA1C
⠈⠳⣄

war is peace. freedom is slavery. ignorance is strength. infection is health.
Marc Haber
2025-03-06 09:00:01 UTC
Permalink
Post by Holger Levsen
Post by Marc Haber
Marc. I'll take my Popcorn with salt please.
yeah, it's pretty funny to see a team burn out and have the same silly
& salty discussion about this again and again.
I apologize for trying to bring a smile into a heated discussion and
will now return to keeping quiet on non-technical issues.
--
-----------------------------------------------------------------------------
Marc Haber | "I don't trust Computers. They | Mailadresse im Header
Leimen, Germany | lose things." Winona Ryder | Fon: *49 6224 1600402
Nordisch by Nature | How to make an American Quilt | Fax: *49 6224 1600421
Matthias Urlichs
2025-03-06 09:10:02 UTC
Permalink
Post by Marc Haber
I apologize for trying to bring a smile into a heated discussion
Thank you.

--
-- regards
--
-- Matthias Urlichs
Soren Stoutner
2025-03-07 23:30:01 UTC
Permalink
Post by Marc Haber
Post by Holger Levsen
Post by Marc Haber
Marc. I'll take my Popcorn with salt please.
yeah, it's pretty funny to see a team burn out and have the same silly
& salty discussion about this again and again.
I apologize for trying to bring a smile into a heated discussion and
will now return to keeping quiet on non-technical issues.
I appreciated the humor.

Sometimes humor is designed to pick on people or make someone feel bad. I don’t
appreciate that type of humor.

Other times humor is designed to lighten a tense conversation in a way that helps
everyone relax and realize that maybe they could discuss a serious topic more
productively if everyone wasn’t so tense about it. I find this type of humor to be pleasant
and helpful.

I felt Marc’s comment was the second. YMMV. :)
--
Soren Stoutner
***@debian.org
Holger Levsen
2025-03-06 10:10:01 UTC
Permalink
Post by Holger Levsen
so if the/a team says they cannot handle new members right now and thus
there should be no big announcement asking for new members, I very much
think this should be respected and not be ignored and spread on our
most visible mailing list, where their pain will be consumed with popcorn.
This has been a long-term recurring complaint without any tangible solution
or a plan from the concerned team, so I think it is important for the whole
project to address it, and not just leave it to the ftp team to resolve it
(they have not been able to address it by themselves for a long time).
yes, maybe, probably.

but the way it's been done here currently is utterly disrespectful and hardly
helpful at all.
--
cheers,
Holger

⢀⣎⠟⠻⢶⣊⠀
⣟⠁⢠⠒⠀⣿⡁ holger@(debian|reproducible-builds|layer-acht).org
⢿⡄⠘⠷⠚⠋⠀ OpenPGP: B8BF54137B09D35CF026FE9D 091AB856069AAA1C
⠈⠳⣄

There are no jobs on a dead planet.
Matthias Urlichs
2025-03-06 09:10:01 UTC
Permalink
Post by Marc Haber
I thank the DPL for putting this to public attention.
Well OK but maybe he should have handled this a bit more diplomatically.

Or maybe he tried to, and failed to get traction. I assume he'll tell us
presently, if only to reduce the popcorn-to-serious-discussion ratio.
Post by Marc Haber
Greetings
Marc. I'll take my Popcorn with salt please.
Please don't. We'll get enough of this sort of remark from LWN soon
enough; treating people's honest concerns (not to mention their actual
work for the project) as entertainment on-list is disingenuous.

--
-- regards
--
-- Matthias Urlichs
Pierre-Elliott Bécue
2025-03-10 11:00:01 UTC
Permalink
Hello,
Post by Sean Whitton
Hello,
Post by Nilesh Patra
Do you mind clarifying why that's the case, unless the reason is truly
personal or undisclosable?
It's pretty simple -- there is no-one with the free time to train them
right now, in which case trainees will simply burn out, because they
won't get enough feedback on their NEW reviews.
We try to recruit only when there is someone who is able to dedicate
some time to training. That depends on what the other team members are
busy with, in and outside of Debian, at a given time.
This was made very clear to Andreas.
There is a risk of such a situation becoming self-perpetuating.

I acknowledge the position of the ftpmasters team, but do you have a
plan to avoid it becoming a vicious circle?

Bests,
--
PEB
Leandro Cunha
2025-03-05 19:10:02 UTC
Permalink
Hello everyone,
Dear Debian community,
this is bits from DPL for February.
Ftpmaster team is seeking for new team members
==============================================
No, we are not.
Andreas asked us whether we would like a call for volunteers included in
Bits. Multiple team members explicitly told him that now would not
be a good time for that, for us.
For the FTP team,
--
Sean Whitton
Regarding this, when would be a good time? I always see the NEW queue
full of packages awaiting approval by your team, and we are grateful when
packages are approved quickly. But at the moment there are no FTP
trainees; wouldn't it be interesting to call for them (whoever is
interested, of course)?

[1] https://ftp-master.debian.org/
--
Cheers,
Leandro Cunha
Matthias Urlichs
2025-03-05 22:30:01 UTC
Permalink
Ftpmaster team is seeking for new team members
==============================================
No, we are not.
The NEW queue currently contains ~135 packages. The median wait time on
the list(*) is three weeks, and the oldest packages have been, well,
languishing, for nine months or so.

(*) Yes I know that this may well be an inflated median: after all, the
packages which ftpmaster *did* process lately are not on the list by
definition. However, that's still 50 people who've been waiting for at
least a month to get their package into Debian.

Of the ITP bugs I spot-checked randomly, none contained a hint why the
package was not yet processed.

If no current team member has free time for pruning this list, adding
new members is the obvious solution. While we all know that bringing new
people up to speed eats time too, not fixing the roof because you're too
busy emptying buckets is not a viable long-term strategy.

If you have a better idea how to improve the situation, by all means
let's hear it.

NB: "now would not be a good time" raises the question of how long you expect
said "now" to last.

--
-- regards
--
-- Matthias Urlichs
Timo Röhling
2025-03-06 10:30:01 UTC
Permalink
Hi,
Post by Matthias Urlichs
The NEW queue currently contains ~135 packages. The median wait time on
the list(*) is three weeks, and the oldest packages have been, well,
languishing, for nine months or so.
(*) Yes I know that this may well be an inflated median: after all,
the packages which ftpmaster *did* process lately are not on the list
by definition. However, that's still 50 people who've been waiting for
at least a month to get their package into Debian.
I'm not an FTP Team member, but I happen to have analyzed exactly this
question in detail [1]. The FTP team is very transparent in this regard
and provides all processing logs, so any DD can verify this for
themselves.

In summary, the median wait time in NEW is currently less than 48 hours,
and in the last 10 years it was seldom longer than a week. 90 percent
of all packages going through NEW are processed within a few weeks. Only
2 percent of all packages going through NEW are held up for several
months or longer.

A typical month sees about 400 packages going through NEW, and up to
twice that many in the month or two directly after a release, when
maintainers rush to catch up with upstream releases or introduce new
stuff for the next release cycle.

That means that in an average month, more than 200 packages pass through
NEW within two days, and only about 20 packages get stuck for more than
three or four weeks.
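
If you want to double-check numbers like these yourself, the arithmetic is
simple once you have one (arrival, decision) timestamp pair per upload. A
rough sketch follows; the CSV layout is my own simplification for
illustration, while the real analysis at [1] works on the dak log files
directly:

    # Assumes a pre-extracted CSV with one row per upload:
    # source,queued_at,processed_at (ISO 8601 timestamps).
    import csv
    from datetime import datetime
    from statistics import quantiles

    def wait_times_in_days(path):
        waits = []
        with open(path) as fh:
            for row in csv.DictReader(fh):
                queued = datetime.fromisoformat(row["queued_at"])
                done = datetime.fromisoformat(row["processed_at"])
                waits.append((done - queued).total_seconds() / 86400)
        return sorted(waits)

    waits = wait_times_in_days("new-queue-times.csv")
    cuts = quantiles(waits, n=100, method="inclusive")
    print(f"median: {cuts[49]:.1f} days, "
          f"90%: {cuts[89]:.1f} days, 98%: {cuts[97]:.1f} days")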

So why do people feel NEW processing is generally slow? I have a few
ideas:

1. People looking at the current state of the NEW queue easily fall prey
to survivorship bias; they see mostly the problematic cases and almost
none of the simple ones.

2. Source packages going through NEW merely because they introduce new
binary packages are typically processed faster than completely new ones.
Maintainers for C/C++ libraries, which need to go through NEW on a semi
regular basis, tend to have a much smoother overall experience than,
say, Python maintainers, who only interact with NEW when they introduce
new packages.

3. Source packages have varying degrees of complexity for
debian/copyright, which is not necessarily the maintainer's fault (some
upstreams seem to treat code with various licenses as some sort of
Pokemon-style collection challenge), but the maintainer has to deal with
it in a way that the FTP team can sign off on. And that may take some
time (on both ends).

4. There is a certain variability in processing times which naturally
comes from the fact that everything we do is volunteer work, which is
totally out of the maintainer's control. I've seen packages pass through
NEW within hours, one time even in less than 60 minutes(!), and I've
waited on a similar one for weeks.

The difficulty of knowing how long the trip through NEW will take has a
significant psychological impact. Close to my home, there is a railway
crossing on a relatively busy track. If the barriers come down, it can
mean a wait time from a minute (a single train) up to 20(!) minutes
(with several and/or long trains in close succession). This does not
happen very often, but you have no way of knowing in advance. Thus,
people take significant detours to avoid that level crossing, as they'd
rather add five minutes for certain to their trip than roll the dice for
an unlucky quarter hour.


Cheers
Timo

[1] https://people.debian.org/~roehling/new_queue/
--
⢀⣎⠟⠻⢶⣊⠀ ╭────────────────────────────────────────────────────╮
⣟⠁⢠⠒⠀⣿⡁ │ Timo Röhling │
⢿⡄⠘⠷⠚⠋⠀ │ 9B03 EBB9 8300 DF97 C2B1 23BF CC8C 6BDD 1403 F4CA │
⠈⠳⣄⠀⠀⠀⠀ ╰────────────────────────────────────────────────────╯
Faidon Liambotis
2025-03-06 13:50:03 UTC
Permalink
Post by Timo Röhling
I'm not an FTP Team member, but I happen to have analyzed exactly this
question in detail [1]. The FTP team is very transparent in this regard and
provides all processing logs, so any DD can verify this for themselves.
Thank you for this analysis and for providing both the data and the
code to produce it; that's super helpful!
Post by Timo Röhling
In summary, the median wait time in NEW is currently less than 48 hours, and
in the last 10 years it was seldom longer than a week. 90 percent of all
packages going through NEW are processed within a few weeks. Only 2 percent
of all packages going through NEW are held up for several months or longer.
This analysis does not/cannot account for the distinction between the truly
"new to the archive" packages and package renames/SONAME bumps etc. It
is also my hypothesis, as you also identified with your later point (2),
that there are significant differences between the time it takes to
complete the two different types of reviews.

While it's totally understandable why the data is presented this way,
it's worth perhaps keeping this limitation in mind and applying some
caution when trying to interpret the data, as median/90p/98p may not
tell the right story in what may well be a bimodal distribution.
Post by Timo Röhling
That means that in an average month, more than 200 packages pass through NEW
within two days, and only about 20 packages get stuck for more than three or
four weeks.
While I agree with basically all your hypotheses (and thank you for so
eloquently making them), I think there is an additional reason, one that
challenges the framing of the question itself. Your question perhaps
incorporates an underlying assumption that "two days", or "[less] than
three or four weeks" is not slow.

I'd like to challenge this assumption.

I can ship code from a VCS host, for free, in a few seconds. Heck, I can
even ship code from a debian.org domain, from a shared "debian"
namespace, in the same amount of time. Salsa admins are not approving
every new repository by hand, and it would be preposterous for anyone to
even suggest doing that.

So why is this different?

One could of course say that we, as a project, offer our main archive as
a more trustworthy and curated set of software than, say, a random
personal GitHub repository, or even the Salsa "debian" namespace. But,
in turn, this makes the assumption that the only way to legally
distribute software of a certain quality and trustworthiness is by
having a team of 5-10 people review it all by hand. This is, IMHO, a
flawed and outdated view. I believe this practice has not historically
scaled alongside the growth of the project, but also _cannot
fundamentally scale_ at the pace of modern software development and the
expectations that many of us have.

The way I see it, gatekeeping in the way the NEW system works was very
common in the 90s and early 2000s, with software development patterns
where development teams authored software, and another set of "more
privileged" teams (whether these were "operations", "release
engineering", "delivery", a "CAB" or "change control board" etc.), would
manually review, approve and ship said software. A certain amount of lag
was expected, and often built into the system. When the next release was
going to be shipped on CDs three years from the time the code was
written, waiting a couple of weeks was kind of OK?

In my experience, these methodologies have fallen out of practice,
through various movements (code review culture, DevOps, "lean" and
"agile" software development, "shift left"), associated tooling (the
rise of VCS & DVCS like git, code review & CI/CD platforms like GitHub
and GitLab, ...) or... even the availability of internet itself.
Basically, collaborative software development has come a long way in the
past ten or twenty years.

That's not to say that these practices do not still exist in certain
sectors/industries, nor that there isn't a lot of grey in between these
two worlds. But I hope we can all agree that the expectations a modern
software developer has for how quickly their code will
reach its intended audience have *radically* changed over the past two
decades.

"A few weeks" simply does not cut it anymore -- the average DD would
likely revolt if they had to wait for a manual review of a few days or
weeks for every Debian upload of theirs, or for every testing migration.
We only tolerate it for NEW because most of us rarely have to go through
it.

With all that said, I'd like to say that I am immensely thankful to
these 5-10 people for their work and what is legitimately a lot of work
we all benefit from. I also understand that it's hard to contemplate
*any* change when you are overworked to the point of burning out, and
especially when you feel your work is thankless and unappreciated.

But, many of us desire more _fundamental_ changes in this space and have
been raising this point for years. I personally have felt stuck
between a rock (status quo) and a hard place (sounding thankless to an
overworked team of volunteers) for more than a decade. So while the
reasons may be understandable, it's especially saddening to hear that
the team does not even want to contemplate adding new members to its
ranks.

So I'd like to ask the ftp-master team in particular: what would you
suggest is the best way to approach your team in collaboratively
evolving and improving the way NEW works? How can the project, either
through its DPL, or as individual members desiring such larger systemic
changes, convince you of the necessity of making said changes, and
ultimately help you in implementing them?

With gratitude,
Faidon
Wookey
2025-03-11 12:50:01 UTC
Permalink
Post by Faidon Liambotis
Your question perhaps
incorporates an underlying assumption that "two days", or "[less] than
three or four weeks" is not slow.
I'd like to challenge this assumption.
I can ship code from a VCS host, for free, in a few seconds. Heck, I can
even ship code from a debian.org domain, from a shared "debian"
namespace, in the same amount of time. Salsa admins are not approving
every new repository by hand, and it would be preposterous for anyone to
even suggest doing that.
<snip>
Post by Faidon Liambotis
But, many of us desire more _fundamental_ changes in this space and have
been raising this point for years. I personally have felt stuck
between a rock (status quo) and a hard place (sounding thankless to an
overworked team of volunteers) for more than a decade....
So I'd like to ask the ftp-master team in particular: what would you
suggest is the best way to approach your team in collaboratively
evolving and improving the way NEW works? How can the project, either
through its DPL, or as individual members desiring such larger systemic
changes, convince you of the necessity of making said changes, and
ultimately help you in implementing them?
This bit of the thread hasn't got any reaction/traction yet, which surprises me slightly.

Do we still even _need_ to pre-review the archive the same way we have
been for 30 years? Couldn't post-review, when actual problems are
noted, be sufficient (given that much of the rest of the ecosystem
seems to manage this, although a lot of that is source rather than
binaries)?

I know this has been discussed before, but it seems to me that this is
something worth reviewing, because NEW reviewing is a big pile of work
and additional friction, and if we _could_ just do less of it, that would be good.

Wookey
--
Principal hats: Debian, Wookware
http://wookware.org/
Jeremy Stanley
2025-03-11 14:40:01 UTC
Permalink
On 2025-03-11 18:52:07 +0530 (+0530), Pirate Praveen wrote:
[...]
I think in previous discussions, it was suggested to pay for a proper
legal opinion, maybe from SFC or SFLC. I think this would be a good
use of Debian's money.
With a proper legal opinion, we will be in a much better position to
evaluate changes to these processes.
SPI has relationships with some great IP lawyers specializing in
F/LOSS licenses and community-run projects, and they charge us very
reasonable rates. All it takes is a clear list of questions and
authorization from Debian leadership for us to engage with counsel
to get answers. Turn-around time is typically somewhere between a
week and a month depending on their availability, and whether the
specific questions necessitate a referral to other colleagues with
slightly different specializations.
--
Jeremy Stanley
Timo Röhling
2025-03-11 15:40:01 UTC
Permalink
Hi,
I think in previous discussions, it was suggested to pay for a proper
legal opinion, maybe from SFC or SFLC. I think this would be a good
use of Debian's money.
With a proper legal opinion, we will be in a much better position to
evaluate changes to these processes.
That depends on your expectations. Making any process legally bulletproof
is like fixing all the security vulnerabilities in a software
package.

It would be interesting to know if we are currently overspending or
underspending on risk mitigation (in terms of time and money). A legal
opinion will be helpful to inform our discussion, but it will not be a
substitute for consensus on our collective risk appetite, i.e., how much
legal exposure we deem acceptable for Getting Things Done.

Cheers
Timo
--
⢀⣎⠟⠻⢶⣊⠀ ╭────────────────────────────────────────────────────╮
⣟⠁⢠⠒⠀⣿⡁ │ Timo Röhling │
⢿⡄⠘⠷⠚⠋⠀ │ 9B03 EBB9 8300 DF97 C2B1 23BF CC8C 6BDD 1403 F4CA │
⠈⠳⣄⠀⠀⠀⠀ ╰────────────────────────────────────────────────────╯
Simon Josefsson
2025-03-07 17:20:01 UTC
Permalink
Your graph and statistics on this are great, thank you!
Post by Timo Röhling
2. Source packages going through NEW merely because they introduce new
binary packages are typically processed faster than completely new ones.
Good point. Therefore, I think your graph gives a biased view for
anyone who thinks of NEW processing time as being the same as the processing
time to add a new source package to the archive.

Is it possible from your data sources to filter these two cases apart?

That is, to get a graph showing the processing time in NEW for adding a
new source package.

Maybe I've just been unlucky with my new source package uploads, but a
48-hour median doesn't match my experience uploading new source
packages for the last 6 months. I would guess a median of say 10 days
for my uploads (with exceptions down to hours and 3+ months), which is
impressively quick interrupt-based volunteer time (thank you!) but
different enough from your conclusion that I suspect there is some bias.
Post by Timo Röhling
The difficulty of knowing how long the trip through NEW will take has a
significant psychological impact. Close to my home, there is a railway
crossing on a relatively busy track. If the barriers come down, it can
mean a wait time from a minute (a single train) up to 20(!) minutes
(with several and/or long trains in close succession). This does not
happen very often, but you have no way of knowing in advance. Thus,
people take significant detours to avoid that level crossing, as
they'd rather add five minutes for certain to their trip than roll the
dice for an unlucky quarter hour.
Yay, thanks for this analogy! It helps to explain that not everything
is captured by simple statistics.

/Simon
Timo Röhling
2025-03-08 23:20:01 UTC
Permalink
Hi Simon,
Post by Simon Josefsson
Is it possible from your data sources to filter these two cases apart?
It is not explicitly recorded, but I can deduce it from the data, as I
have the name of the .changes file and can take everything before the
first underscore (_) as the source package name.

For the sake of simplicity, I did not split the dataset into monthly
chunks. Instead, I binned the processing times by four mutually
exclusive outcomes. So, without further ado, these are the percentiles
for all uploads to NEW from September 2012 [1] until January 2025:


33743 ACCEPTs
50% - 4 days, 18:10:30
90% - 42 days, 3:26:44
98% - 106 days, 12:47:56

24443 ACCEPTs (binNEW)
50% - 2 days, 1:25:25
90% - 13 days, 23:44:49
98% - 67 days, 23:07:27

6318 REJECTs
50% - 8 days, 4:03:34
90% - 98 days, 16:03:15
98% - 267 days, 4:23:37

1712 REJECTs (binNEW)
50% - 21:28:34
90% - 43 days, 0:35:03
98% - 173 days, 1:30:30

I'm pretty sure that you can fit exponential probability distributions
to these, but that is work for another day.
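
For the curious, a rough sketch of that binning and of such a fit; the
entry format and the known_sources set below are simplifications for
illustration, not necessarily how the real scripts work:

    # `entries` is assumed to be a list of (changes_filename, outcome, wait_days)
    # tuples; `known_sources` is the set of source packages already in the
    # archive, used to approximate the binNEW / source-NEW distinction.
    from collections import defaultdict
    from scipy import stats

    def bin_uploads(entries, known_sources):
        groups = defaultdict(list)
        for changes, outcome, wait_days in entries:
            source = changes.split("_", 1)[0]   # text before the first underscore
            kind = "binNEW" if source in known_sources else "source NEW"
            groups[f"{outcome} ({kind})"].append(wait_days)
        return groups

    def fit_exponential(groups):
        for name, waits in sorted(groups.items()):
            loc, scale = stats.expon.fit(waits, floc=0)  # scale == fitted mean wait
            print(f"{name}: n={len(waits)}, fitted mean wait ~ {scale:.1f} days")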

Cheers
Timo


[1] In case you are wondering what the significance of that date is, it
is when the dak log files changed to the current format, and I was too
lazy to implement parsing support for the older ones. It also means
there are a few false negatives for my detection of binNEW uploads, but
I doubt it changes the results by much.
--
⢀⣎⠟⠻⢶⣊⠀ ╭────────────────────────────────────────────────────╮
⣟⠁⢠⠒⠀⣿⡁ │ Timo Röhling │
⢿⡄⠘⠷⠚⠋⠀ │ 9B03 EBB9 8300 DF97 C2B1 23BF CC8C 6BDD 1403 F4CA │
⠈⠳⣄⠀⠀⠀⠀ ╰────────────────────────────────────────────────────╯
Julien Plissonneau Duquène
2025-03-09 11:20:01 UTC
Permalink
Hi,

Thank you Timo for these statistics.
Over that period we get a 15.8% reject rate (nearly 1 in 6) for
non-binNEW (6.5% for binNEW, about 1 in 15), which is significant. The
fact that the decision delays more than double in the case of rejects
possibly suggests that it takes much more work to reject a package than
to accept it (or that packages that are complicated to review are more
likely to be rejected).
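
For reference, these rates are simply rejects divided by (accepts + rejects),
computed from the four counts quoted above:

    # Reject rate = rejects / (accepts + rejects), using the counts quoted above.
    print(6318 / (33743 + 6318))    # ~0.158 for completely new source packages
    print(1712 / (24443 + 1712))    # ~0.065 for binNEW uploads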

It would be interesting to check if the recent trend (e.g. last 2 or 3
years) is similar. If so, it's certainly worth the trouble to try to
minimize the rejection rate to reduce the ftp masters' workload and their
processing delays. Having new packages sponsored and thoroughly reviewed
by another DD could help to weed out common causes of rejection, and
maybe, in compensation, if we don't want to make this mandatory for all
packages, the sponsored packages could be processed with priority by the
ftp masters.

Cheers,
--
Julien Plissonneau Duquène
Sean Whitton
2025-03-09 05:10:02 UTC
Permalink
Hello,
Post by Simon Josefsson
Your graph and statistics on this are great, thank you!
Post by Timo Röhling
2. Source packages going through NEW merely because they introduce new
binary packages are typically processed faster than completely new ones.
Good point. Therefore, I think your graph gives a biased view for
anyone who thinks of NEW processing time as being the same as the processing
time to add a new source package to the archive.
Just to note that per the FTP team docs[1] we perform a full copyright
and license review even if it's just a SONAME bump. I do not think we
should be doing this, but it's the team policy.

[1] https://salsa.debian.org/ftp-team/manpages
--
Sean Whitton
Julien Plissonneau Duquène
2025-03-09 11:20:01 UTC
Permalink
Hi,
Post by Sean Whitton
Just to note that per the FTP team docs[1] we perform a full copyright
and license review even if it's just a SONAME bump. I do not think we
should be doing this, but it's the team policy.
It makes sense in a way, as it keeps the process more consistent across
packages and reviewers, and also because upstream projects keep evolving, as
do Debian policies and guidelines.

What may help here is a team policy adjustment where recently reviewed
packages could skip the full process if the change set since the last
reviewed version is reasonably small, together with tools that can remember
which parts of the upstream project were already reviewed, when they were
last reviewed, and possibly annotate them (see the sketch below). But I'm
not sure there is much to gain here overall.
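
Purely as a sketch of what such a tool could do (nothing like this exists in
dak today, as far as I know): record a hash of every upstream file at review
time, then report which files changed in a later upload and therefore deserve
a fresh look.

    import hashlib
    import json
    import pathlib

    def snapshot(tree):
        """Map every file below `tree` to the SHA-256 of its contents."""
        root = pathlib.Path(tree)
        return {str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
                for p in root.rglob("*") if p.is_file()}

    def needs_fresh_review(tree, manifest_path):
        """Return files that are new or changed since the recorded review."""
        reviewed = json.loads(pathlib.Path(manifest_path).read_text())
        current = snapshot(tree)
        return sorted(f for f, digest in current.items() if reviewed.get(f) != digest)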

Cheers,
--
Julien Plissonneau Duquène
Sean Whitton
2025-03-09 05:10:02 UTC
Permalink
Hello,

Thank you, Timo, for all the info. I think you're quite right about the
psychological impacts and the comparison with the level crossing is apt.
--
Sean Whitton