Coverage for dak/process_upload.py: 82%
324 statements
coverage.py v7.6.0, created at 2026-03-14 12:19 +0000

#! /usr/bin/env python3

"""
Checks Debian packages from Incoming
@contact: Debian FTP Master <ftpmaster@debian.org>
@copyright: 2000, 2001, 2002, 2003, 2004, 2005, 2006 James Troup <james@nocrew.org>
@copyright: 2009 Joerg Jaspert <joerg@debian.org>
@copyright: 2009 Mark Hymers <mhy@debian.org>
@copyright: 2009 Frank Lichtenheld <djpig@debian.org>
@license: GNU General Public License version 2 or later
"""

# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.

# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.

# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA  02111-1307  USA

# based on process-unchecked and process-accepted

## pu|pa: locking (daily.lock)
## pu|pa: parse arguments -> list of changes files
## pa: initialize urgency log
## pu|pa: sort changes list

## foreach changes:
### pa: load dak file
## pu: copy CHG to tempdir
## pu: check CHG signature
## pu: parse changes file
## pu: checks:
## pu: check distribution (mappings, rejects)
## pu: copy FILES to tempdir
## pu: check whether CHG already exists in CopyChanges
## pu: check whether FILES already exist in one of the policy queues
## for deb in FILES:
## pu: extract control information
## pu: various checks on control information
## pu|pa: search for source (in CHG, projectb, policy queues)
## pu|pa: check whether "Version" fulfills target suite requirements/suite propagation
## pu|pa: check whether deb already exists in the pool
## for src in FILES:
## pu: various checks on filenames and CHG consistency
## pu: if isdsc: check signature
## for file in FILES:
## pu: various checks
## pu: NEW?
## //pu: check whether file already exists in the pool
## pu: store what "Component" the package is currently in
## pu: check whether we found everything we were looking for in CHG
## pu: check the DSC:
## pu: check whether we need and have ONE DSC
## pu: parse the DSC
## pu: various checks //maybe drop some of them in favor of lintian
## pu|pa: check whether "Version" fulfills target suite requirements/suite propagation
## pu: check whether DSC_FILES is consistent with "Format"
## for src in DSC_FILES:
## pu|pa: check whether file already exists in the pool (with special handling for .orig.tar.gz)
## pu: create new tempdir
## pu: create symlink mirror of source
## pu: unpack source
## pu: extract changelog information for BTS
## //pu: create missing .orig symlink
## pu: check with lintian
## for file in FILES:
## pu: check checksums and sizes
## for file in DSC_FILES:
## pu: check checksums and sizes
## pu: CHG: check urgency
## for deb in FILES:
## pu: extract contents list and check for dubious timestamps
## pu: check that the uploader is actually allowed to upload the package
### pa: install:
### if stable_install:
### pa: remove from p-u
### pa: add to stable
### pa: move CHG to morgue
### pa: append data to ChangeLog
### pa: send mail
### pa: remove .dak file
### else:
### pa: add dsc to db:
### for file in DSC_FILES:
### pa: add file to file
### pa: add file to dsc_files
### pa: create source entry
### pa: update source associations
### pa: update src_uploaders
### for deb in FILES:
### pa: add deb to db:
### pa: add file to file
### pa: find source entry
### pa: create binaries entry
### pa: update binary associations
### pa: .orig component move
### pa: move files to pool
### pa: save CHG
### pa: move CHG to done/
### pa: change entry in queue_build
## pu: use dispatch table to choose target queue:
## if NEW:
## pu: write .dak file
## pu: move to NEW
## pu: send mail
## elsif AUTOBYHAND:
## pu: run autobyhand script
## pu: if stuff left, do byhand or accept
## elsif targetqueue in (oldstable, stable, embargo, unembargo):
## pu: write .dak file
## pu: check overrides
## pu: move to queue
## pu: send mail
## else:
## pu: write .dak file
## pu: move to ACCEPTED
## pu: send mails
## pu: create files for BTS
## pu: create entry in queue_build
## pu: check overrides

# Integrity checks
## GPG
## Parsing changes (check for duplicates)
## Parse dsc
## file list checks

# New check layout (TODO: Implement)
## Permission checks
### suite mappings
### ACLs
### version checks (suite)
### override checks

## Source checks
### copy orig
### unpack
### BTS changelog
### src contents
### lintian
### urgency log

## Binary checks
### timestamps
### control checks
### src relation check
### contents

## Database insertion (? copy from stuff)
### BYHAND / NEW / Policy queues
### Pool

## Queue builds

import datetime
import errno
import fcntl
import functools
import os
import random
import sys
import time
import traceback
from collections.abc import Callable, Iterable
from typing import Concatenate, NoReturn

import apt_pkg

import daklib.announce
import daklib.archive
import daklib.checks
import daklib.upload
import daklib.utils as utils
from daklib import daklog
from daklib.config import Config
from daklib.dbconn import DBConn, Keyring, SignatureHistory
from daklib.regexes import re_default_answer
from daklib.summarystats import SummaryStats
from daklib.urgencylog import UrgencyLog

###############################################################################

Options: apt_pkg.Configuration
Logger: daklog.Logger

###############################################################################


def usage(exit_code=0) -> NoReturn:
    print(
        """Usage: dak process-upload [OPTION]... [CHANGES]...
  -a, --automatic           automatic run
  -d, --directory <DIR>     process uploads in <DIR>
  -h, --help                show this help and exit.
      --max-duration <D>    stop processing after duration (e.g. 10m, 1h 5m)
  -n, --no-action           don't do anything
  -p, --no-lock             don't check lockfile !! for cron.daily only !!
  -s, --no-mail             don't send any mail
  -V, --version             display the version number and exit"""
    )
    sys.exit(exit_code)


###############################################################################

type Handler[**P, R] = Callable[Concatenate[str, daklib.archive.ArchiveUpload, P], R]


def try_or_reject[**P, R](function: Handler[P, R]) -> Handler[P, R]:
    """Try to call function or reject the upload if that fails"""

    @functools.wraps(function)
    def wrapper(directory: str, upload: daklib.archive.ArchiveUpload, *args, **kwargs):
        reason = "No exception caught. This should not happen."

        try:
            return function(directory, upload, *args, **kwargs)
        except (daklib.archive.ArchiveException, daklib.checks.Reject) as e:
            reason = str(e)
        except Exception:
            reason = "There was an uncaught exception when processing your upload:\n{0}\nAny original reject reason follows below.".format(
                traceback.format_exc()
            )

        try:
            upload.rollback()
            return real_reject(directory, upload, reason=reason)
        except Exception:
            reason = "In addition there was an exception when rejecting the package:\n{0}\nPrevious reasons:\n{1}".format(
                traceback.format_exc(), reason
            )
            upload.rollback()
            return real_reject(directory, upload, reason=reason, notify=False)

        raise Exception(
            "Rejecting upload failed after multiple tries. Giving up. Last reason:\n{0}".format(
                reason
            )
        )

    return wrapper

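The `try_or_reject` decorator above follows a general pattern: run the handler, and on any failure roll back the transaction and fall through to a rejection path that records why. A minimal stand-alone sketch of that pattern follows; `FakeUpload`, `try_or_fallback`, and `record_reject` are illustrative stand-ins, not daklib APIs.

```python
import functools
import traceback


def try_or_fallback(fallback):
    """Run the wrapped handler; on any exception, roll back and invoke fallback."""

    def decorator(function):
        @functools.wraps(function)
        def wrapper(upload, *args, **kwargs):
            try:
                return function(upload, *args, **kwargs)
            except Exception:
                # Undo partial work, then convert the failure into a rejection.
                upload.rollback()
                return fallback(upload, reason=traceback.format_exc())

        return wrapper

    return decorator


class FakeUpload:
    """Illustrative stand-in for daklib.archive.ArchiveUpload."""

    def __init__(self):
        self.rolled_back = False
        self.rejected_reason = None

    def rollback(self):
        self.rolled_back = True


def record_reject(upload, reason=None):
    upload.rejected_reason = reason
    return "rejected"


@try_or_fallback(record_reject)
def failing_handler(upload):
    raise RuntimeError("checksum mismatch")


upload = FakeUpload()
result = failing_handler(upload)
# The failure was converted into a rejection and the transaction rolled back.
```

The real decorator additionally distinguishes expected rejections (`ArchiveException`, `Reject`) from unexpected crashes, and retries the rejection itself without notification if the first attempt fails.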


def get_processed_upload(
    upload: daklib.archive.ArchiveUpload,
) -> daklib.announce.ProcessedUpload:
    changes = upload.changes
    control = upload.changes.changes

    pu = daklib.announce.ProcessedUpload()

    pu.maintainer = control.get("Maintainer")
    pu.changed_by = control.get("Changed-By")
    pu.fingerprint = changes.primary_fingerprint
    pu.authorized_by_fingerprint = upload.authorized_by_fingerprint.fingerprint

    pu.suites = upload.final_suites or []
    pu.from_policy_suites = []

    with open(upload.changes.path, "r") as fd:
        pu.changes = fd.read()
    pu.changes_filename = upload.changes.filename
    pu.sourceful = upload.changes.sourceful
    pu.source = control.get("Source")
    pu.version = control.get("Version")
    pu.architecture = control.get("Architecture")
    pu.bugs = changes.closed_bugs

    pu.program = "process-upload"

    pu.warnings = upload.warnings

    return pu


@try_or_reject
def accept(directory: str, upload: daklib.archive.ArchiveUpload) -> None:
    cnf = Config()

    Logger.log(["ACCEPT", upload.changes.filename])
    print("ACCEPT")

    upload.install()
    utils.process_buildinfos(
        upload.directory, upload.changes.buildinfo_files, upload.transaction.fs, Logger
    )

    assert upload.final_suites is not None
    accepted_to_real_suite = any(
        suite.policy_queue is None for suite in upload.final_suites
    )
    sourceful_upload = upload.changes.sourceful

    control = upload.changes.changes
    if sourceful_upload and not Options["No-Action"]:
        urgency = control.get("Urgency")
        # As per policy 5.6.17, the urgency can be followed by a space and a
        # comment. Extract only the urgency from the string.
        if " " in urgency:
            urgency, comment = urgency.split(" ", 1)
        if urgency not in cnf.value_list("Urgency::Valid"):
            urgency = cnf["Urgency::Default"]
        UrgencyLog().log(control["Source"], control["Version"], urgency)

    pu = get_processed_upload(upload)
    daklib.announce.announce_accept(pu)

    # Move .changes to done, but only for uploads that were accepted to a
    # real suite. process-policy will handle this for uploads to queues.
    if accepted_to_real_suite:
        src = os.path.join(upload.directory, upload.changes.filename)

        now = datetime.datetime.now()
        donedir = os.path.join(cnf["Dir::Done"], now.strftime("%Y/%m/%d"))
        dst = os.path.join(donedir, upload.changes.filename)
        dst = utils.find_next_free(dst)

        upload.transaction.fs.copy(src, dst, mode=0o644)

    SummaryStats().accept_count += 1
    SummaryStats().accept_bytes += upload.changes.bytes

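The urgency normalisation inside `accept()` can be isolated as a small pure function: strip the optional comment that policy 5.6.17 allows after a space, then fall back to a default for unknown values. A sketch, where `VALID_URGENCIES` and the default are illustrative values rather than dak's actual `Urgency::Valid`/`Urgency::Default` configuration:

```python
VALID_URGENCIES = {"low", "medium", "high", "emergency", "critical"}  # illustrative
DEFAULT_URGENCY = "low"  # illustrative default


def normalise_urgency(urgency: str) -> str:
    """Strip a trailing comment (policy 5.6.17 allows one after a space)
    and fall back to the default for unknown values."""
    if " " in urgency:
        urgency, _comment = urgency.split(" ", 1)
    if urgency not in VALID_URGENCIES:
        urgency = DEFAULT_URGENCY
    return urgency
```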

@try_or_reject
def accept_to_new(directory: str, upload: daklib.archive.ArchiveUpload) -> None:

    Logger.log(["ACCEPT-TO-NEW", upload.changes.filename])
    print("ACCEPT-TO-NEW")

    upload.install_to_new()
    # TODO: tag bugs pending

    pu = get_processed_upload(upload)
    daklib.announce.announce_new(pu)

    SummaryStats().accept_count += 1
    SummaryStats().accept_bytes += upload.changes.bytes


@try_or_reject
def reject(
    directory: str,
    upload: daklib.archive.ArchiveUpload,
    reason: str | None = None,
    notify=True,
) -> None:
    real_reject(directory, upload, reason, notify)


def real_reject(
    directory: str,
    upload: daklib.archive.ArchiveUpload,
    reason: str | None = None,
    notify=True,
) -> None:
    # XXX: rejection itself should go to daklib.archive.ArchiveUpload
    cnf = Config()

    Logger.log(["REJECT", upload.changes.filename])
    print("REJECT")

    fs = upload.transaction.fs
    rejectdir = cnf["Dir::Reject"]

    files = [f.filename for f in upload.changes.files.values()]
    files.append(upload.changes.filename)

    for fn in files:
        src = os.path.join(upload.directory, fn)
        dst = utils.find_next_free(os.path.join(rejectdir, fn))
        if not os.path.exists(src):
            continue
        fs.copy(src, dst)

    if upload.reject_reasons is not None:
        if reason is None:
            reason = ""
        reason = reason + "\n" + "\n".join(upload.reject_reasons)

    if reason is None:
        reason = "(Unknown reason. Please check logs.)"

    dst = utils.find_next_free(
        os.path.join(rejectdir, "{0}.reason".format(upload.changes.filename))
    )
    fh = fs.create(dst)
    fh.write(reason)
    fh.close()

    if notify:
        pu = get_processed_upload(upload)
        daklib.announce.announce_reject(pu, reason)

    SummaryStats().reject_count += 1

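`real_reject` relies on `utils.find_next_free` so that rejecting the same filename twice never clobbers an earlier copy in `Dir::Reject`. A minimal stand-in for that helper, under the assumption that it probes `name`, `name.1`, `name.2`, … until an unused path is found (daklib's actual suffix scheme may differ):

```python
import os
import tempfile


def find_next_free(dest: str, max_tries: int = 100) -> str:
    """Return dest if unused, otherwise dest.1, dest.2, ...
    Illustrative stand-in; daklib's utils.find_next_free may use a
    different suffix scheme."""
    candidate = dest
    for extra in range(1, max_tries):
        if not os.path.exists(candidate):
            return candidate
        candidate = "{0}.{1}".format(dest, extra)
    raise RuntimeError("no free filename for {0}".format(dest))


with tempfile.TemporaryDirectory() as tmp:
    base = os.path.join(tmp, "foo.changes")
    first = find_next_free(base)  # base itself is still free
    open(base, "w").close()       # now occupy it
    second = find_next_free(base)  # falls back to a suffixed name
```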


###############################################################################


def action(directory: str, upload: daklib.archive.ArchiveUpload) -> bool:
    changes = upload.changes
    processed = True

    global Logger

    cnf = Config()

    okay = upload.check()

    try:
        summary = changes.changes.get("Changes", "")
    except UnicodeDecodeError as e:
        summary = "Reading changes failed: %s" % (e)
        # the upload checks should have detected this, but make sure this
        # upload gets rejected in any case
        upload.reject_reasons.append(summary)

    package_info = []
    if okay:
        if changes.source is not None:
            package_info.append("source:{0}".format(changes.source.dsc["Source"]))
        for binary in changes.binaries:
            package_info.append("binary:{0}".format(binary.control["Package"]))

    (prompt, answer) = ("", "XXX")
    if Options["No-Action"] or Options["Automatic"]:
        answer = "S"

    print(summary)
    print()
    print("\n".join(package_info))
    print()
    if len(upload.warnings) > 0:
        print("\n".join(upload.warnings))
        print()

    if len(upload.reject_reasons) > 0:
        print("Reason:")
        print("\n".join(upload.reject_reasons))
        print()

        path = os.path.join(directory, changes.filename)
        created = os.stat(path).st_mtime
        now = time.time()
        too_new = now - created < int(cnf["Dinstall::SkipTime"])

        if too_new:
            print("SKIP (too new)")
            prompt = "[S]kip, Quit ?"
        else:
            prompt = "[R]eject, Skip, Quit ?"
            if Options["Automatic"]:
                answer = "R"
    elif upload.new:
        prompt = "[N]ew, Skip, Quit ?"
        if Options["Automatic"]:
            answer = "N"
    else:
        prompt = "[A]ccept, Skip, Quit ?"
        if Options["Automatic"]:
            answer = "A"

    while prompt.find(answer) == -1:
        answer = utils.input_or_exit(prompt)
        m = re_default_answer.match(prompt)
        if answer == "":
            assert m is not None
            answer = m.group(1)
        answer = answer[:1].upper()

    if answer == "R":
        reject(directory, upload)
    elif answer == "A":
        # upload.try_autobyhand must not be run with No-Action.
        if Options["No-Action"]:
            accept(directory, upload)
        elif upload.try_autobyhand():
            accept(directory, upload)
        else:
            print("W: redirecting to BYHAND as automatic processing failed.")
            accept_to_new(directory, upload)
    elif answer == "N":
        accept_to_new(directory, upload)
    elif answer == "Q":
        sys.exit(0)
    elif answer == "S":
        processed = False

    if not Options["No-Action"]:
        upload.commit()

    return processed


###############################################################################


def unlink_if_exists(path: str) -> None:
    try:
        os.unlink(path)
    except OSError as e:
        if e.errno != errno.ENOENT:
            raise


def process_it(
    directory: str, changes: daklib.upload.Changes, keyrings: list[str]
) -> None:
    global Logger

    print("\n{0}\n".format(changes.filename))
    Logger.log(["Processing changes file", changes.filename])

    with daklib.archive.ArchiveUpload(directory, changes, keyrings) as upload:
        processed = action(directory, upload)
        if processed and not Options["No-Action"]:
            session = DBConn().session()
            history = SignatureHistory.from_signed_file(upload.changes)
            if history.query(session) is None:
                session.add(history)
                session.commit()
            session.close()

            unlink_if_exists(os.path.join(directory, changes.filename))
            for fn in changes.files:
                unlink_if_exists(os.path.join(directory, fn))


###############################################################################


def _source_group(source: str) -> str:
    """decide group for given source name

    This is mostly for Secure Boot signing where "X" should be
    processed in the same group as (and before) "X-signed-*".

    As a further special case, "grub2" needs to be processed in the
    same group as (and before) "grub-efi-*-signed".
    """

    group = source.split("-", 1)[0]
    if group == "grub2":
        return "grub"
    return group


def _group_changes_by_source_and_shuffle(
    changes: list[tuple[str, daklib.upload.Changes]],
) -> list[tuple[str, daklib.upload.Changes]]:
    """Group changes by Source, sort each group, and shuffle group order."""
    grouped: dict[str, list[tuple[str, daklib.upload.Changes]]] = {}
    for directory, change in changes:
        source = _source_group(change.changes.get("Source", ""))
        grouped.setdefault(source, []).append((directory, change))

    for group in grouped.values():
        group.sort(key=lambda item: item[1])

    source_names = list(grouped)
    random.shuffle(source_names)
    return [item for source in source_names for item in grouped[source]]
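The two helpers above keep "X" in the same group as "X-signed-*" (so Secure Boot templates are processed before their signed counterparts) while randomising the order across unrelated groups. A stand-alone sketch of that behaviour on plain source-name strings; it is simplified (real entries are `(directory, Changes)` tuples, and daklib sorts by the `Changes` objects' own ordering, here approximated by a shortest-name-first key):

```python
import random


def source_group(source: str) -> str:
    # "grub2" must group with "grub-efi-*-signed"; otherwise group on the
    # part before the first dash, so "shim" groups with "shim-signed".
    group = source.split("-", 1)[0]
    return "grub" if group == "grub2" else group


def group_and_shuffle(sources: list[str]) -> list[str]:
    grouped: dict[str, list[str]] = {}
    for s in sources:
        grouped.setdefault(source_group(s), []).append(s)
    for group in grouped.values():
        # Illustrative key: shorter (unsigned) names first within a group;
        # the real code sorts the Changes objects themselves.
        group.sort(key=lambda s: (len(s), s))
    names = list(grouped)
    random.shuffle(names)  # random order across groups, stable within
    return [s for name in names for s in grouped[name]]


uploads = ["shim-signed", "grub2", "shim", "grub-efi-amd64-signed", "linux"]
ordered = group_and_shuffle(uploads)
```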


###############################################################################


def process_changes(
    changes_filenames: Iterable[str],
    max_duration: datetime.timedelta | None = None,
) -> None:
    deadline: float | None = None
    if max_duration is not None:
        deadline = time.monotonic() + max_duration.total_seconds()

    session = DBConn().session()
    keyrings = session.query(Keyring).filter_by(active=True).order_by(Keyring.priority)
    keyring_files = [k.keyring_name for k in keyrings]
    session.close()

    changes = []
    for fn in changes_filenames:
        try:
            directory, filename = os.path.split(fn)
            c = daklib.upload.Changes(directory, filename, keyring_files)
            changes.append((directory, c))
        except Exception as e:
            try:
                Logger.log(
                    [
                        filename,
                        "Error while loading changes file {0}: {1}".format(fn, e),
                    ]
                )
            except Exception as e:
                Logger.log(
                    [
                        filename,
                        "Error while loading changes file {0}, with additional error while printing exception: {1}".format(
                            fn, repr(e)
                        ),
                    ]
                )

    changes = _group_changes_by_source_and_shuffle(changes)

    for directory, c in changes:
        if deadline is not None and time.monotonic() >= deadline:
            Logger.log(["Max duration reached; stopping processing loop"])
            break
        process_it(directory, c, keyring_files)

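The `--max-duration` handling above converts a timedelta into a monotonic-clock deadline that is checked before each item, so a long-running upload is never interrupted mid-way; processing only stops between items. A minimal sketch of the same pattern (`process` here is a stand-in for `process_it`):

```python
import datetime
import time


def run_until_deadline(items, process, max_duration=None):
    """Process items in order, stopping before the next item once the
    deadline passes. Uses time.monotonic() so wall-clock jumps don't matter."""
    deadline = None
    if max_duration is not None:
        deadline = time.monotonic() + max_duration.total_seconds()
    done = []
    for item in items:
        if deadline is not None and time.monotonic() >= deadline:
            break  # stop between items, never mid-item
        done.append(process(item))
    return done


# With a zero-length budget nothing is processed; with no budget everything is.
none_done = run_until_deadline([1, 2, 3], lambda x: x * 2,
                               max_duration=datetime.timedelta(0))
all_done = run_until_deadline([1, 2, 3], lambda x: x * 2)
```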

###############################################################################


def main() -> None:
    global Options, Logger

    cnf = Config()
    summarystats = SummaryStats()

    Arguments = [
        ("a", "automatic", "Dinstall::Options::Automatic"),
        ("h", "help", "Dinstall::Options::Help"),
        ("\0", "max-duration", "Dinstall::Options::Max-Duration", "HasArg"),
        ("n", "no-action", "Dinstall::Options::No-Action"),
        ("p", "no-lock", "Dinstall::Options::No-Lock"),
        ("s", "no-mail", "Dinstall::Options::No-Mail"),
        ("d", "directory", "Dinstall::Options::Directory", "HasArg"),
    ]

    for i in [
        "automatic",
        "help",
        "max-duration",
        "no-action",
        "no-lock",
        "no-mail",
        "version",
        "directory",
    ]:
        key = "Dinstall::Options::%s" % i
        if key not in cnf:
            cnf[key] = ""

    changes_files = apt_pkg.parse_commandline(cnf.Cnf, Arguments, sys.argv)  # type: ignore[attr-defined]
    Options = cnf.subtree("Dinstall::Options")

    if Options["Help"]:
        usage()

    # -n/--dry-run invalidates some other options which would involve things happening
    if Options["No-Action"]:
        Options["Automatic"] = ""  # type: ignore[index]

    # Obtain lock if not in no-action mode and initialize the log
    if not Options["No-Action"]:
        lock_fd = os.open(
            os.path.join(cnf["Dir::Lock"], "process-upload.lock"),
            os.O_RDWR | os.O_CREAT,
        )
        try:
            fcntl.flock(lock_fd, fcntl.LOCK_EX | fcntl.LOCK_NB)
        except OSError as e:
            if e.errno in (errno.EACCES, errno.EAGAIN):
                utils.fubar(
                    "Couldn't obtain lock; assuming another 'dak process-upload' is already running."
                )
            else:
                raise

        # Initialise UrgencyLog() - it will deal with the case where we don't
        # want to log urgencies
        urgencylog = UrgencyLog()

    Logger = daklog.Logger("process-upload", Options["No-Action"])

    # If we have a directory flag, use it to find our files
    if cnf["Dinstall::Options::Directory"] != "":
        # Note that we clobber the list of files we were given in this case
        # so warn if the user has done both
        if len(changes_files) > 0:
            utils.warn("Directory provided so ignoring files given on command line")

        changes_files = utils.get_changes_files(cnf["Dinstall::Options::Directory"])
        Logger.log(
            [
                "Using changes files from directory",
                cnf["Dinstall::Options::Directory"],
                len(changes_files),
            ]
        )
    elif not len(changes_files) > 0:
        utils.fubar("No changes files given and no directory specified")
    else:
        Logger.log(["Using changes files from command-line", len(changes_files)])

    max_duration = None
    if Options["Max-Duration"]:
        try:
            max_duration = utils.parse_duration(Options["Max-Duration"])
        except ValueError as e:
            utils.fubar("Invalid --max-duration: %s" % e)

    process_changes(changes_files, max_duration=max_duration)

    if summarystats.accept_count:
        sets = "set"
        if summarystats.accept_count > 1:
            sets = "sets"
        print(
            "Installed %d package %s, %s."
            % (
                summarystats.accept_count,
                sets,
                utils.size_type(int(summarystats.accept_bytes)),
            )
        )
        Logger.log(["total", summarystats.accept_count, summarystats.accept_bytes])

    if summarystats.reject_count:
        sets = "set"
        if summarystats.reject_count > 1:
            sets = "sets"
        print("Rejected %d package %s." % (summarystats.reject_count, sets))
        Logger.log(["rejected", summarystats.reject_count])

    if not Options["No-Action"]:
        urgencylog.close()

    Logger.close()

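`main()` guards against concurrent runs with a non-blocking exclusive `flock`: a second `dak process-upload` fails immediately with EAGAIN instead of queueing behind the first. Because flock locks belong to the open file description, even a second `os.open` of the same file conflicts, which makes the pattern easy to demonstrate in one process (on Linux; `lockfile` here is a throwaway path, not dak's `Dir::Lock` file):

```python
import fcntl
import os
import tempfile

lockfile = os.path.join(tempfile.mkdtemp(), "process-upload.lock")

# First opener takes the exclusive lock without blocking.
fd1 = os.open(lockfile, os.O_RDWR | os.O_CREAT)
fcntl.flock(fd1, fcntl.LOCK_EX | fcntl.LOCK_NB)

# A second open file description on the same file now fails with EAGAIN
# (raised as BlockingIOError) rather than waiting for the lock.
fd2 = os.open(lockfile, os.O_RDWR | os.O_CREAT)
got_lock = True
try:
    fcntl.flock(fd2, fcntl.LOCK_EX | fcntl.LOCK_NB)
except BlockingIOError:
    got_lock = False

os.close(fd2)
os.close(fd1)  # closing the descriptor releases the lock
```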


###############################################################################


if __name__ == "__main__":
    main()