[mirrorbrain-commits] r7965 - in /trunk: docs/installation/ mirrordoctor/ mirrordoctor/mb/ tools/

From: <poeml@mirrorbrain.org>
Date: Mon, 08 Mar 2010 20:45:17 -0000
Author: poeml
Date: Mon Mar  8 21:45:15 2010
New Revision: 7965

URL: http://svn.mirrorbrain.org/viewvc/mirrorbrain?rev=7965&view=rev
Log:
- All functionality from tools/metalink-hasher.py has been moved into the "mb" tool.
  There was no reason to keep this in an external script, since the
  functionality is tightly integrated with MirrorBrain. See issue #40.
- Hashing is now twice as fast, because the external "metalink" binary is no longer used.
- The new command is "mb makehashes"; "metalink-hasher" remains as a thin wrapper for backwards compatibility.
- This also makes it easy to later store hashes in a database instead of in the file system.
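- For illustration, the documented invocation changes from

      metalink-hasher update /srv/opensuse -t /srv/metalink-hashes/srv/opensuse

  to

      mb makehashes /srv/opensuse -t /srv/hashes/srv/opensuse

  (both command lines are the ones shown in docs/installation/source.rst below).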

Added:
    trunk/mirrordoctor/mb/hashes.py
Modified:
    trunk/docs/installation/source.rst
    trunk/mirrordoctor/mirrordoctor.py
    trunk/tools/metalink-hasher.py

Modified: trunk/docs/installation/source.rst
URL: http://svn.mirrorbrain.org/viewvc/mirrorbrain/trunk/docs/installation/source.rst?rev=7965&r1=7964&r2=7965&view=diff
==============================================================================
--- trunk/docs/installation/source.rst (original)
+++ trunk/docs/installation/source.rst Mon Mar  8 21:45:15 2010
@@ -361,14 +361,14 @@
     * First, add some configuration::
 
         MirrorBrainMetalinkPublisher "openSUSE" http://download.opensuse.org
-        MirrorBrainMetalinkHashesPathPrefix /srv/metalink-hashes/srv/opensuse
+        MirrorBrainMetalinkHashesPathPrefix /srv/hashes/srv/opensuse
 
     * install the "metalink" tool from http://metamirrors.nl/metalinks_project
       (openSUSE/Debian/Ubuntu package called metalink, to be found at
       http://download.opensuse.org/repositories/Apache:/MirrorBrain/)
 
     * you need to create a directory where to store the hashes. For instance,
-      :file:`/srv/metalink-hashes/srv/opensuse`. Note that the full pathname to
+      :file:`/srv/hashes/srv/opensuse`. Note that the full pathname to
       the filetree (``/srv/opensuse``) is part of this target path.
       
       Make the directory owned by the ``mirrorbrain`` user.
@@ -376,7 +376,7 @@
     * now, create the hashes with the following command. It is best run as
       unprivileged user (``mirrorbrain``)::
 
-        metalink-hasher update /srv/opensuse -t /srv/metalink-hashes/srv/opensuse
+        mb makehashes /srv/opensuse -t /srv/hashes/srv/opensuse
 
 
     * add the hashing command to /etc/crontab to be run every few hours. Alternatively, run
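
A sketch of the crontab step that the docs mention above, assuming a four-hourly
schedule (the command and the "mirrorbrain" user come from the docs; the schedule
itself is an assumption):

    # /etc/crontab
    0 */4 * * *   mirrorbrain   mb makehashes /srv/opensuse -t /srv/hashes/srv/opensuse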

Added: trunk/mirrordoctor/mb/hashes.py
URL: http://svn.mirrorbrain.org/viewvc/mirrorbrain/trunk/mirrordoctor/mb/hashes.py?rev=7965&view=auto
==============================================================================
--- trunk/mirrordoctor/mb/hashes.py (added)
+++ trunk/mirrordoctor/mb/hashes.py Mon Mar  8 21:45:15 2010
@@ -1,0 +1,187 @@
+#!/usr/bin/python
+
+import os
+import os.path
+import stat
+
+try:
+    import hashlib
+    md5 = hashlib
+    sha1 = hashlib
+    sha256 = hashlib
+except ImportError:
+    # Python 2.4 fallback: use the old md5/sha modules
+    import md5
+    import sha as sha1
+    # give them hashlib-compatible constructor names
+    md5.md5 = md5.new
+    sha1.sha1 = sha1.new
+    # Python 2.4 has no sha256 module in the standard library
+    sha256 = None
+
+PIECESIZE = 262144
+
+
+class Hasheable:
+    """represent a file and its metadata"""
+    def __init__(self, basename, src_dir=None, dst_dir=None):
+        self.basename = basename
+        if src_dir:
+            self.src_dir = src_dir
+        else:
+            self.src_dir = os.path.dirname(self.basename)
+
+        self.src = os.path.join(self.src_dir, self.basename)
+
+        self.finfo = os.lstat(self.src)
+        self.atime = self.finfo.st_atime
+        self.mtime = self.finfo.st_mtime
+        self.size  = self.finfo.st_size
+        self.inode = self.finfo.st_ino
+        self.mode  = self.finfo.st_mode
+
+        self.dst_dir = dst_dir
+
+        self.dst_basename = '%s.size_%s' % (self.basename, self.size)
+        self.dst = os.path.join(self.dst_dir, self.dst_basename)
+
+    def islink(self):
+        return stat.S_ISLNK(self.mode)
+    def isreg(self):
+        return stat.S_ISREG(self.mode)
+    def isdir(self):
+        return stat.S_ISDIR(self.mode)
+
+    def do_hashes(self, verbose=False, dry_run=False, copy_permissions=True):
+        try:
+            dst_statinfo = os.stat(self.dst)
+            dst_mtime = dst_statinfo.st_mtime
+            dst_size = dst_statinfo.st_size
+        except OSError:
+            dst_mtime = dst_size = 0 # file missing
+
+        if int(dst_mtime) == int(self.mtime) and dst_size != 0:
+            if verbose:
+                print 'Up to date: %r' % self.dst
+            return 
+
+        if dry_run: 
+            print 'Would make hashes for: ', self.src
+            return
+
+        digests = Digests(src = self.src)
+
+        # if present, grab PGP signature
+        if os.path.exists(self.src + '.asc'):
+            digests.pgp = open(self.src + '.asc').read()
+
+        digests.read()
+
+        d = open(self.dst, 'wb')
+        d.write(digests.dump_2_12_template())
+        d.close()
+
+        os.utime(self.dst, (self.atime, self.mtime))
+
+        if copy_permissions:
+            os.chmod(self.dst, self.mode)
+        else:
+            os.chmod(self.dst, 0644)
+
+    #def __eq__(self, other):
+    #    return self.basename == other.basename
+    #def __eq__(self, basename):
+    #    return self.basename == basename
+        
+    def __str__(self):
+        return self.basename
+
+
+
+class Digests:
+    def __init__(self, src):
+        self.src = src
+        self.basename = os.path.basename(src)
+
+        self.md5 = None
+        self.sha1 = None
+        self.sha256 = None
+        self.pgp = None
+
+        self.npieces = 0
+        self.pieces = []
+
+
+    def read(self):
+        m = md5.md5()
+        s1 = sha1.sha1()
+        if sha256:
+            s256 = sha256.sha256()
+        else:
+            s256 = None
+        short_read_before = False
+
+        f = open(self.src, 'rb')
+
+        while True:
+            buf = f.read(PIECESIZE)
+            if not buf: break
+
+            if len(buf) != PIECESIZE:
+                # only the last piece may be shorter than PIECESIZE
+                if not short_read_before:
+                    short_read_before = True
+                else:
+                    raise IOError('short read in the middle of %r' % self.src)
+
+            m.update(buf)
+            s1.update(buf)
+            if s256:
+                s256.update(buf)
+
+            self.npieces += 1
+            self.pieces.append(sha1.sha1(buf).hexdigest())
+
+        f.close()
+
+        self.md5 = m.hexdigest()
+        self.sha1 = s1.hexdigest()
+        if s256:
+            self.sha256 = s256.hexdigest()
+
+    def dump_raw(self):
+        r = []
+        for i in self.pieces:
+            r.append('piece %s' % i)
+        r.append('md5 %s' % self.md5)
+        r.append('sha1 %s' % self.sha1)
+        if self.sha256:
+            r.append('sha256 %s' % self.sha256)
+        return '\n'.join(r)
+
+
+    def __str__(self):
+        return self.dump_raw()
+
+
+    def dump_2_12_template(self):
+        """dump in the form that was used up to mirrorbrain-2.12.0"""
+
+        r = []
+
+
+        r.append("""      <verification>
+        <hash type="md5">%s</hash>
+        <hash type="sha1">%s</hash>""" % (self.md5, self.sha1))
+        if self.sha256:
+            r.append('        <hash type="sha256">%s</hash>' % (self.sha256))
+
+        if self.pgp:
+            r.append('        <signature type="pgp" file="%s.asc">' % self.basename)
+            r.append(self.pgp)
+            r.append('        </signature>')
+
+        r.append('        <pieces length="%s" type="sha1">' % (PIECESIZE))
+
+        n = 0
+        for piece in self.pieces:
+            r.append('            <hash piece="%s">%s</hash>' % (n, piece))
+            n += 1
+
+        r.append('        </pieces>\n      </verification>\n')
+
+        return '\n'.join(r)
+
+
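
A minimal usage sketch of the new mb.hashes module, for orientation. The class and
method names are the ones added above; the file and directory names are made up:

    import mb.hashes

    # describe one file in the mirrored tree and where its hash file should go
    # (the destination directory must already exist)
    h = mb.hashes.Hasheable('openSUSE-11.2-DVD-i586.iso',
                            src_dir='/srv/opensuse/distribution/11.2/iso',
                            dst_dir='/srv/hashes/srv/opensuse/distribution/11.2/iso')

    if h.isreg():
        # writes <dst_dir>/<basename>.size_<bytes> containing the <verification>
        # block, unless an up-to-date hash file already exists
        h.do_hashes(verbose=True, dry_run=False, copy_permissions=False)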

Modified: trunk/mirrordoctor/mirrordoctor.py
URL: http://svn.mirrorbrain.org/viewvc/mirrorbrain/trunk/mirrordoctor/mirrordoctor.py?rev=7965&r1=7964&r2=7965&view=diff
==============================================================================
--- trunk/mirrordoctor/mirrordoctor.py (original)
+++ trunk/mirrordoctor/mirrordoctor.py Mon Mar  8 21:45:15 2010
@@ -826,6 +826,234 @@
         print 'Completed in', mb.util.timer_elapsed()
 
 
+
+    @cmdln.option('-n', '--dry-run', action='store_true',
+                        help='don\'t actually do anything, just show what would be done')
+    @cmdln.option('--copy-permissions', action='store_true',
+                        help='copy the permissions of directories and files '
+                             'to the hashes files. Normally, this should not '
+                             'be needed, because the hash files don\'t contain '
+                             'any reversible information.')
+    @cmdln.option('-f', '--file-mask', metavar='REGEX',
+                        help='regular expression to select files to create hashes for')
+    @cmdln.option('-i', '--ignore-mask', metavar='REGEX',
+                        help='regular expression to ignore certain files or directories. '
+                             'If matching a file, no hashes are created for it. '
+                             'If matching a directory, the directory is ignored and '
+                             'deleted in the target tree.')
+    @cmdln.option('-b', '--base-dir', metavar='PATH',
+                        help='set the base directory (so that you can work on a subdirectory)')
+    @cmdln.option('-t', '--target-dir', metavar='PATH',
+                        help='set a different target directory')
+    @cmdln.option('-v', '--verbose', action='store_true',
+                        help='show more information')
+    def do_makehashes(self, subcmd, opts, startdir):
+        """${cmd_name}: Update the verification hashes, e.g. for inclusion into Metalinks
+
+        Examples:
+
+        mb makehashes /srv/mirrors/mozilla -t /srv/metalink-hashes/srv/mirrors/mozilla
+
+        mb makehashes \\
+            -t /srv/metalink-hashes/srv/ftp/pub/opensuse/repositories/home:/poeml \\
+            /srv/ftp-stage/pub/opensuse/repositories/home:/poeml \\
+            -i '^.*/repoview/.*$'
+
+        mb makehashes \\
+            -f '.*.(torrent|iso)$' \\
+            -t /var/lib/apache2/metalink-hashes/srv/ftp/pub/opensuse/distribution/11.0/iso \\
+            -b /srv/ftp-stage/pub/opensuse/distribution/11.0/iso \\
+            /srv/ftp-stage/pub/opensuse/distribution/11.0/iso \\
+            -n
+
+        ${cmd_usage}
+        ${cmd_option_list}
+        """
+
+        import os
+        import shutil
+        import fcntl
+        import errno
+        import re
+        import mb.hashes
+
+        if not opts.target_dir:
+            sys.exit('You must specify the target directory (-t)')
+        if not opts.base_dir:
+            opts.base_dir = startdir
+            #sys.exit('You must specify the base directory (-b)')
+
+        if not opts.target_dir.startswith('/'):
+            sys.exit('The target directory must be an absolute path')
+        if not opts.base_dir.startswith('/'):
+            sys.exit('The base directory must be an absolute path')
+
+        startdir = startdir.rstrip('/')
+        opts.target_dir = opts.target_dir.rstrip('/')
+        opts.base_dir = opts.base_dir.rstrip('/')
+
+        if not os.path.exists(startdir):
+            sys.exit('STARTDIR %r does not exist' % startdir) 
+
+        directories_todo = [startdir]
+
+        if opts.ignore_mask: 
+            opts.ignore_mask = re.compile(opts.ignore_mask)
+        if opts.file_mask: 
+            opts.file_mask = re.compile(opts.file_mask)
+
+        unlinked_files = unlinked_dirs = 0
+
+        while len(directories_todo) > 0:
+            src_dir = directories_todo.pop(0)
+
+            try:
+                src_dir_mode = os.stat(src_dir).st_mode
+            except OSError, e:
+                if e.errno == errno.ENOENT:
+                    sys.stderr.write('Directory vanished: %r\n' % src_dir)
+                    continue
+                raise
+
+            dst_dir = os.path.join(opts.target_dir, src_dir[len(opts.base_dir):].lstrip('/'))
+
+            if not opts.dry_run:
+                if not os.path.isdir(dst_dir):
+                    os.makedirs(dst_dir, mode = 0755)
+                if opts.copy_permissions:
+                    os.chmod(dst_dir, src_dir_mode)
+                else:
+                    os.chmod(dst_dir, 0755)
+
+            try:
+                dst_names = os.listdir(dst_dir)
+                dst_names.sort()
+            except OSError, e:
+                if e.errno == errno.ENOENT:
+                    sys.exit('\nSorry, cannot really continue in dry-run mode, because directory %r does not exist.\n'
+                             'You might want to create it:\n'
+                             '  mkdir %s' % (dst_dir, dst_dir))
+                raise
+
+
+            # a set offers the fastest access for "foo in ..." lookups
+            src_basenames = set(os.listdir(src_dir))
+
+            if opts.verbose:
+                print 'looking at', src_dir
+
+            dst_keep = set()
+            dst_keep.add('LOCK')
+
+            lockfile = os.path.join(dst_dir, 'LOCK')
+            try:
+                if not opts.dry_run:
+                    lock = open(lockfile, 'w')
+                    fcntl.lockf(lock, fcntl.LOCK_EX | fcntl.LOCK_NB)
+                    try:
+                        os.stat(lockfile)
+                    except OSError, e: 
+                        if e.errno == errno.ENOENT:
+                            if opts.verbose:
+                                print '====== skipping %s, which we were about to lock' % lockfile
+                            continue
+
+                if opts.verbose:
+                    print 'locked %s' % lockfile
+            except IOError, e:
+                if e.errno in [ errno.EAGAIN, errno.EACCES, errno.EWOULDBLOCK ]:
+                    print 'Skipping %r, which is locked' % src_dir
+                    continue
+                else:
+                    raise
+
+
+            for src_basename in sorted(src_basenames):
+                src = os.path.join(src_dir, src_basename)
+
+                if opts.ignore_mask and re.match(opts.ignore_mask, src):
+                    continue
+
+                # stat only once
+                try:
+                    hasheable = mb.hashes.Hasheable(src_basename, 
+                                                    src_dir=src_dir, 
+                                                    dst_dir=dst_dir)
+                except OSError, e:
+                    if e.errno == errno.ENOENT:
+                        sys.stderr.write('File vanished: %r\n' % src)
+                        continue
+                    raise
+
+                if hasheable.islink():
+                    if opts.verbose:
+                        print 'ignoring link', src
+                    continue
+
+                elif hasheable.isreg():
+                    if not opts.file_mask or re.match(opts.file_mask, src_basename):
+                        #if opts.verbose:
+                        #    print 'dst:', dst
+                        hasheable.do_hashes(verbose=opts.verbose, 
+                                            dry_run=opts.dry_run, 
+                                            copy_permissions=opts.copy_permissions)
+                        dst_keep.add(hasheable.dst_basename)
+
+                elif hasheable.isdir():
+                    directories_todo.append(src)  # It's a directory, store it.
+                    dst_keep.add(hasheable.basename)
+
+
+            dst_remove = set(dst_names) - dst_keep
+
+            # print 'files to keep:'
+            # print dst_keep
+            # print
+            # print 'files to remove:'
+            # print dst_remove
+            # print
+
+            for i in sorted(dst_remove):
+                i_path = os.path.join(dst_dir, i)
+                #print i_path
+
+                if (opts.ignore_mask and re.match(opts.ignore_mask, i_path)):
+                    print 'ignoring, not removing %r' % i_path
+                    continue
+
+                if os.path.isdir(i_path):
+                    print 'Recursively removing obsolete directory %r' % i_path
+                    if not opts.dry_run: 
+                        try:
+                            shutil.rmtree(i_path)
+                        except OSError, e:
+                            if e.errno == errno.EACCES:
+                                sys.stderr.write('Recursive removing failed for %r (%s). Ignoring.\n' \
+                                                    % (i_path, os.strerror(e.errno)))
+                            else:
+                                sys.exit('Recursive removing failed for %r: %s\n' \
+                                                    % (i_path, os.strerror(e.errno)))
+                    unlinked_dirs += 1
+                    
+                else:
+                    print 'Unlinking obsolete %r' % i_path
+                    if not opts.dry_run: 
+                        try:
+                            os.unlink(i_path)
+                        except OSError, e:
+                            if e.errno != errno.ENOENT:
+                                sys.stderr.write('Unlink failed for %r: %s\n' \
+                                                    % (i_path, os.strerror(e.errno)))
+                    unlinked_files += 1
+
+            if opts.verbose:
+                print 'unlocking', lockfile 
+            if not opts.dry_run:
+                os.unlink(lockfile)
+                lock.close()
+
+        if unlinked_files or unlinked_dirs:
+            print 'Unlinked %d files, %d directories.' % (unlinked_files, unlinked_dirs)
+
+
+
+
     def do_score(self, subcmd, opts, *args):
         """${cmd_name}: show or change the score of a mirror
 

Modified: trunk/tools/metalink-hasher.py
URL: http://svn.mirrorbrain.org/viewvc/mirrorbrain/trunk/tools/metalink-hasher.py?rev=7965&r1=7964&r2=7965&view=diff
==============================================================================
--- trunk/tools/metalink-hasher.py (original)
+++ trunk/tools/metalink-hasher.py Mon Mar  8 21:45:15 2010
@@ -1,18 +1,8 @@
 #!/usr/bin/python
 
 # metalink-hasher -- create metalink hashes
-#
-# The "essence" of what this script does is the following:
-# metalink --nomirrors -d md5 -d sha1 -d sha1pieces "$1" | grep '<.*\(verification\|hash\)>'
-# 
-# This script requires the cmdln module, which you can obtain here:
-# http://trentm.com/projects/cmdln/
-# and the metalink commandline tool, which you can find here:
-# http://metamirrors.nl/metalinks_project
-# 
-# 
 # Copyright 2008,2009,2010 Peter Poeml
-# 
+
 # This program is free software; you can redistribute it and/or
 # modify it under the terms of the GNU General Public License version 2
 # as published by the Free Software Foundation;
@@ -27,362 +17,14 @@
 # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA  02110-1301, USA
 
 
-__version__ = '1.2'
-__author__ = 'Peter Poeml <poeml_at_cmdline.net>'
-__copyright__ = 'Peter poeml <poeml_at_cmdline.net>'
-__license__ = 'GPLv2'
-__url__ = 'http://mirrorbrain.org'
 
+# This script was superseded by functionality in the 'mb' tool
+# at the beginning of 2010.
 
+import sys
 import os
-import os.path
-import stat
-import shutil
-import cmdln
-import re
-import subprocess
-import errno
-import fcntl
-import signal
 
-line_mask = re.compile('.*</*(verification|hash|pieces).*>.*')
+args = ['mb', 'makehashes'] + sys.argv[2:]
+os.execlp('mb', *args)
 
-class SignalInterrupt(Exception):
-    """Exception raised on SIGTERM and SIGHUP."""
-
-def catchterm(*args):
-    raise SignalInterrupt
-
-for name in 'SIGBREAK', 'SIGHUP', 'SIGTERM':
-    num = getattr(signal, name, None)
-    if num: signal.signal(num, catchterm)
-
-
-class Hasheable:
-    """represent a file and its metadata"""
-    def __init__(self, basename, src_dir=None, dst_dir=None):
-        self.basename = basename
-        if src_dir:
-            self.src_dir = src_dir
-        else:
-            self.src_dir = os.path.dirname(self.basename)
-
-        self.src = os.path.join(src_dir, self.basename)
-
-        self.finfo = os.lstat(self.src)
-        self.atime = self.finfo.st_atime
-        self.mtime = self.finfo.st_mtime
-        self.size  = self.finfo.st_size
-        self.inode = self.finfo.st_ino
-        self.mode  = self.finfo.st_mode
-
-        self.dst_dir = dst_dir
-
-        self.dst_basename = '%s.size_%s' % (self.basename, self.size)
-        self.dst = os.path.join(self.dst_dir, self.dst_basename)
-
-    def islink(self):
-        return stat.S_ISLNK(self.mode)
-    def isreg(self):
-        return stat.S_ISREG(self.mode)
-    def isdir(self):
-        return stat.S_ISDIR(self.mode)
-
-    def do_hashes(self, verbose=False, dry_run=False, copy_permissions=True):
-        try:
-            dst_statinfo = os.stat(self.dst)
-            dst_mtime = dst_statinfo.st_mtime
-            dst_size = dst_statinfo.st_size
-        except OSError:
-            dst_mtime = dst_size = 0 # file missing
-
-        if int(dst_mtime) == int(self.mtime) and dst_size != 0:
-            if verbose:
-                print 'Up to date: %r' % self.dst
-            return 
-
-        cmd = [ 'metalink',
-                '--nomirrors', 
-                '-d', 'md5', 
-                '-d', 'sha1', 
-                '-d', 'sha256', 
-                '-d', 'sha1pieces',
-                self.src ]
-
-        if dry_run: 
-            print 'Would run: ', ' '.join(cmd)
-            return
-
-        sys.stdout.flush()
-        o = subprocess.Popen(cmd, stdout=subprocess.PIPE,
-                        close_fds=True).stdout
-        lines = []
-        for line in o.readlines():
-            if re.match(line_mask, line):
-                line = line.replace('\t\t', ' ' * 6)
-                lines.append(line)
-
-
-        # if present, add PGP signature into the <verification> block
-        if os.path.exists(self.src + '.asc'):
-            sig = open(self.src + '.asc').read()
-            sig = '        <signature type="pgp" file="%s.asc">\n' % self.basename + \
-                  sig + \
-                  '\n        </signature>\n'
-
-            lines.insert(1, sig)
-
-        d = open(self.dst, 'wb')
-        d.write(''.join(lines))
-        d.close()
-
-        os.utime(self.dst, (self.atime, self.mtime))
-
-        if copy_permissions:
-            os.chmod(self.dst, self.mode)
-        else:
-            os.chmod(self.dst, 0644)
-
-    #def __eq__(self, other):
-    #    return self.basename == other.basename
-    #def __eq__(self, basename):
-    #    return self.basename == basename
-        
-    def __str__(self):
-        return self.basename
-
-
-
-class Metalinks(cmdln.Cmdln):
-
-    @cmdln.option('-n', '--dry-run', action='store_true',
-                        help='don\'t actually do anything, just show what would be done')
-    @cmdln.option('--copy-permissions', action='store_true',
-                        help='copy the permissions of directories and files '
-                             'to the hashes files. Normally, this should not '
-                             'be needed, because the hash files don\'t contain '
-                             'any reversible information.')
-    @cmdln.option('-f', '--file-mask', metavar='REGEX',
-                        help='regular expression to select files to create hashes for')
-    @cmdln.option('-i', '--ignore-mask', metavar='REGEX',
-                        help='regular expression to ignore certain files or directories. '
-                             'If matching a file, no hashes are created for it. '
-                             'If matching a directory, the directory is ignored and '
-                             'deleted in the target tree.')
-    @cmdln.option('-b', '--base-dir', metavar='PATH',
-                        help='set the base directory (so that you can work on a subdirectory)')
-    @cmdln.option('-t', '--target-dir', metavar='PATH',
-                        help='set a different target directory')
-    @cmdln.option('-v', '--verbose', action='store_true',
-                        help='show more information')
-    def do_update(self, subcmd, opts, startdir):
-        """${cmd_name}: Update the hash pieces that are included in metalinks
-
-        Examples:
-
-        metalink-hasher update /srv/mirrors/mozilla -t /srv/metalink-hashes/srv/mirrors/mozilla
-
-        metalink-hasher update \\
-            -t /srv/metalink-hashes/srv/ftp/pub/opensuse/repositories/home:/poeml \\
-            /srv/ftp-stage/pub/opensuse/repositories/home:/poeml \\
-            -i '^.*/repoview/.*$'
-
-        metalink-hasher update \\
-            -f '.*.(torrent|iso)$' \\
-            -t /var/lib/apache2/metalink-hashes/srv/ftp/pub/opensuse/distribution/11.0/iso \\
-            -b /srv/ftp-stage/pub/opensuse/distribution/11.0/iso \\
-            /srv/ftp-stage/pub/opensuse/distribution/11.0/iso \\
-            -n
-
-        ${cmd_usage}
-        ${cmd_option_list}
-        """
-
-        if not opts.target_dir:
-            sys.exit('You must specify the target directory (-t)')
-        if not opts.base_dir:
-            opts.base_dir = startdir
-            #sys.exit('You must specify the base directory (-b)')
-
-        if not opts.target_dir.startswith('/'):
-            sys.exit('The target directory must be an absolut path')
-        if not opts.base_dir.startswith('/'):
-            sys.exit('The base directory must be an absolut path')
-
-        startdir = startdir.rstrip('/')
-        opts.target_dir = opts.target_dir.rstrip('/')
-        opts.base_dir = opts.base_dir.rstrip('/')
-
-        if not os.path.exists(startdir):
-            sys.exit('STARTDIR %r does not exist' % startdir) 
-
-        directories_todo = [startdir]
-
-        if opts.ignore_mask: 
-            opts.ignore_mask = re.compile(opts.ignore_mask)
-        if opts.file_mask: 
-            opts.file_mask = re.compile(opts.file_mask)
-
-        unlinked_files = unlinked_dirs = 0
-
-        while len(directories_todo) > 0:
-            src_dir = directories_todo.pop(0)
-
-            try:
-                src_dir_mode = os.stat(src_dir).st_mode
-            except OSError, e:
-                if e.errno == errno.ENOENT:
-                    sys.stderr.write('Directory vanished: %r\n' % src_dir)
-                    continue
-
-            dst_dir = os.path.join(opts.target_dir, src_dir[len(opts.base_dir):].lstrip('/'))
-
-            if not opts.dry_run:
-                if not os.path.isdir(dst_dir):
-                    os.makedirs(dst_dir, mode = 0755)
-                if opts.copy_permissions:
-                    os.chmod(dst_dir, src_dir_mode)
-                else:
-                    os.chmod(dst_dir, 0755)
-
-            try:
-                dst_names = os.listdir(dst_dir)
-                dst_names.sort()
-            except OSError, e:
-                if e.errno == errno.ENOENT:
-                    sys.exit('\nSorry, cannot really continue in dry-run mode, because directory %r does not exist.\n'
-                             'You might want to create it:\n'
-                             '  mkdir %s' % (dst_dir, dst_dir))
-
-
-            # a set offers the fastest access for "foo in ..." lookups
-            src_basenames = set(os.listdir(src_dir))
-
-            if opts.verbose:
-                print 'looking at', src_dir
-
-            dst_keep = set()
-            dst_keep.add('LOCK')
-
-            lockfile = os.path.join(dst_dir, 'LOCK')
-            try:
-                if not opts.dry_run:
-                    lock = open(lockfile, 'w')
-                    fcntl.lockf(lock, fcntl.LOCK_EX | fcntl.LOCK_NB)
-                    try:
-                        os.stat(lockfile)
-                    except OSError, e: 
-                        if e.errno == errno.ENOENT:
-                            if opts.verbose:
-                                print '====== skipping %s, which we were about to lock' % lockfile
-                            continue
-
-                if opts.verbose:
-                    print 'locked %s' % lockfile
-            except IOError, e:
-                if e.errno in [ errno.EAGAIN, errno.EACCES, errno.EWOULDBLOCK ]:
-                    print 'Skipping %r, which is locked' % src_dir
-                    continue
-                else:
-                    raise
-
-
-            for src_basename in sorted(src_basenames):
-                src = os.path.join(src_dir, src_basename)
-
-                if opts.ignore_mask and re.match(opts.ignore_mask, src):
-                    continue
-
-                # stat only once
-                try:
-                    hasheable = Hasheable(src_basename, src_dir=src_dir, dst_dir=dst_dir)
-                except OSError, e:
-                    if e.errno == errno.ENOENT:
-                        sys.stderr.write('File vanished: %r\n' % src)
-                        continue
-
-                if hasheable.islink():
-                    if opts.verbose:
-                        print 'ignoring link', src
-                    continue
-
-                elif hasheable.isreg():
-                    if not opts.file_mask or re.match(opts.file_mask, src_basename):
-                        #if opts.verbose:
-                        #    print 'dst:', dst
-                        hasheable.do_hashes(verbose=opts.verbose, 
-                                            dry_run=opts.dry_run, 
-                                            copy_permissions=opts.copy_permissions)
-                        dst_keep.add(hasheable.dst_basename)
-
-                elif hasheable.isdir():
-                    directories_todo.append(src)  # It's a directory, store it.
-                    dst_keep.add(hasheable.basename)
-
-
-            dst_remove = set(dst_names) - dst_keep
-
-            # print 'files to keep:'
-            # print dst_keep
-            # print
-            # print 'files to remove:'
-            # print dst_remove
-            # print
-
-            for i in sorted(dst_remove):
-                i_path = os.path.join(dst_dir, i)
-                #print i_path
-
-                if (opts.ignore_mask and re.match(opts.ignore_mask, i_path)):
-                    print 'ignoring, not removing %s', i_path
-                    continue
-
-                if os.path.isdir(i_path):
-                    print 'Recursively removing obsolete directory %r' % i_path
-                    if not opts.dry_run: 
-                        try:
-                            shutil.rmtree(i_path)
-                        except OSError, e:
-                            if e.errno == errno.EACCES:
-                                sys.stderr.write('Recursive removing failed for %r (%s). Ignoring.\n' \
-                                                    % (i_path, os.strerror(e.errno)))
-                            else:
-                                sys.exit('Recursive removing failed for %r: %s\n' \
-                                                    % (i_path, os.strerror(e.errno)))
-                    unlinked_dirs += 1
-                    
-                else:
-                    print 'Unlinking obsolete %r' % i_path
-                    if not opts.dry_run: 
-                        try:
-                            os.unlink(i_path)
-                        except OSError, e:
-                            if e.errno != errno.ENOENT:
-                                sys.stderr.write('Unlink failed for %r: %s\n' \
-                                                    % (i_path, os.strerror(e.errno)))
-                    unlinked_files += 1
-
-            if opts.verbose:
-                print 'unlocking', lockfile 
-            if not opts.dry_run:
-                os.unlink(lockfile)
-                lock.close()
-
-        if  unlinked_files or unlinked_dirs:
-            print 'Unlinked %s files, %d directories.' % (unlinked_files, unlinked_dirs)
-
-
-
-if __name__ == '__main__':
-    import sys
-
-    try:
-        metalinks = Metalinks()
-        sys.exit( metalinks.main() )
-
-    except SignalInterrupt:
-        print >>sys.stderr, 'killed!'
-
-    except KeyboardInterrupt:
-        print >>sys.stderr, 'interrupted!'
-
+sys.exit(0)




_______________________________________________
mirrorbrain-commits mailing list
Archive: http://mirrorbrain.org/archive/mirrorbrain-commits/
