# HG changeset patch
# User ymh
# Date 1306422939 -7200
# Node ID 896db0083b7669b4d7ff328108dfe4cfa1b31830
first commit

diff -r 000000000000 -r 896db0083b76 .hgignore
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/.hgignore Thu May 26 17:15:39 2011 +0200
@@ -0,0 +1,10 @@
+
+syntax: glob
+virtualenv/web/env/*
+
+syntax: regexp
+^web/hdabo/config\.py$
+^web/hdabo/.htaccess$
+
+syntax: regexp
+^web/static/site$
\ No newline at end of file
diff -r 000000000000 -r 896db0083b76 .project
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/.project Thu May 26 17:15:39 2011 +0200
@@ -0,0 +1,17 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<projectDescription>
+	<name>hdabo</name>
+	<comment></comment>
+	<projects>
+	</projects>
+	<buildSpec>
+		<buildCommand>
+			<name>org.python.pydev.PyDevBuilder</name>
+			<arguments>
+			</arguments>
+		</buildCommand>
+	</buildSpec>
+	<natures>
+		<nature>org.python.pydev.pythonNature</nature>
+	</natures>
+</projectDescription>
diff -r 000000000000 -r 896db0083b76 .pydevproject
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/.pydevproject Thu May 26 17:15:39 2011 +0200
@@ -0,0 +1,10 @@
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+<?eclipse-pydev version="1.0"?>
+<pydev_project>
+<pydev_property name="org.python.pydev.PYTHON_PROJECT_INTERPRETER">python_hdabo</pydev_property>
+<pydev_property name="org.python.pydev.PYTHON_PROJECT_VERSION">python 2.7</pydev_property>
+<pydev_pathproperty name="org.python.pydev.PROJECT_SOURCE_PATH">
+<path>/hdabo/web</path>
+</pydev_pathproperty>
+</pydev_project>
+
diff -r 000000000000 -r 896db0083b76 .settings/org.eclipse.core.resources.prefs
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/.settings/org.eclipse.core.resources.prefs Thu May 26 17:15:39 2011 +0200
@@ -0,0 +1,4 @@
+#Thu May 26 16:39:53 CEST 2011
+eclipse.preferences.version=1
+encoding//web/hdabo/management/commands/importcsv.py=utf-8
+encoding//web/hdabo/models.py=utf-8
diff -r 000000000000 -r 896db0083b76 sbin/.keepme
diff -r 000000000000 -r 896db0083b76 sql/.keepme
diff -r 000000000000 -r 896db0083b76 virtualenv/.keepme
diff -r 000000000000 -r 896db0083b76 virtualenv/res/lib/lib_create_env.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/virtualenv/res/lib/lib_create_env.py Thu May 26 17:15:39 2011 +0200
@@ -0,0 +1,291 @@
+import sys
+import os
+import os.path
+import shutil
+import tarfile
+import zipfile
+import urllib
+import platform
+import patch
+
+join = os.path.join
+system_str = platform.system()
+
+
+URLS = {
+    #'': {'setup': '', 'url':'', 'local':''},
+    'DISTRIBUTE': {'setup': 'distribute', 'url':'http://pypi.python.org/packages/source/d/distribute/distribute-0.6.14.tar.gz', 'local':"distribute-0.6.14.tar.gz"},
+    'DJANGO': {'setup': 'django', 'url': 'http://www.djangoproject.com/download/1.3/tarball/', 'local':"Django-1.3.tar.gz"},
+    'DJANGO-EXTENSIONS': { 'setup': 'django-extensions', 'url':'https://github.com/django-extensions/django-extensions/tarball/0.6', 'local':"django-extensions-0.6.tar.gz"},
+    'SOUTH': { 'setup': 'South', 'url':'http://www.aeracode.org/releases/south/south-0.7.3.tar.gz', 'local':"south-0.7.3.tar.gz"},
+    'HTTPLIB2': { 'setup': 'python-httplib2', 'url':'http://httplib2.googlecode.com/files/httplib2-0.6.0.tar.gz', 'local':"httplib2-0.6.0.tar.gz"},
+    'HAYSTACK': { 'setup': 'haystack', 'url':'https://github.com/toastdriven/django-haystack/zipball/v1.2.2', 'local': "django-haystack-v1.2.2.tar.gz"},
+    'WHOOSH' : { 'setup': 'Whoosh', 'url': 'https://bitbucket.org/mchaput/whoosh/get/tip.tar.bz2', 'local': 'whoosh-1.8.3.tar.bz2'},
+}
+
+if system_str == 'Windows':
+    URLS.update({
+        'PSYCOPG2': {'setup': 'psycopg2','url': 'psycopg2-2.0.14.win32-py2.6-pg8.4.3-release.zip', 'local':"psycopg2-2.0.14.win32-py2.6-pg8.4.3-release.zip"},
+        'JCC': {'setup': 'jcc', 'url': 'http://pylucene-win32-binary.googlecode.com/files/JCC-2.6-py2.6-win32.egg', 'local':"JCC-2.6-py2.6-win32.egg"},
+        'PYLUCENE': {'setup': 'pylucene', 'url': 'http://pylucene-win32-binary.googlecode.com/files/lucene-3.0.2-py2.6-win32.egg', 'local':"lucene-3.0.2-py2.6-win32.egg"},
+        'PIL': {'setup': 'pil', 'url': 'http://effbot.org/media/downloads/PIL-1.1.7.win32-py2.6.exe',
'local':"PIL-1.1.7.win32-py2.6.exe"}, + 'LXML': {'setup': 'lxml', 'url': 'http://pypi.python.org/packages/2.6/l/lxml/lxml-2.2.2-py2.6-win32.egg', 'local':"lxml-2.2.2-py2.6-win32.egg"} + }) +else: + URLS.update({ + 'PSYCOPG2': {'setup': 'psycopg2','url': 'http://www.psycopg.org/psycopg/tarballs/PSYCOPG-2-4/psycopg2-2.4.tar.gz', 'local':"psycopg2-2.4.tar.gz"}, + 'PYLUCENE': {'setup': 'pylucene', 'url': 'http://apache.crihan.fr/dist//lucene/pylucene/pylucene-3.1.0-1-src.tar.gz', 'local':"pylucene-3.1.0-1-src.tar.gz"}, + 'PIL': {'setup': 'pil', 'url': 'http://effbot.org/downloads/Imaging-1.1.7.tar.gz', 'local':"Imaging-1.1.7.tar.gz"}, + 'LXML': {'setup': 'lxml', 'url':"lxml-2.3.tar.bz2", 'local':"lxml-2.3.tar.bz2"} + }) + + + +class ResourcesEnv(object): + + def __init__(self, src_base, urls, normal_installs): + self.src_base = src_base + self.URLS = {} + self.__init_url(urls) + self.NORMAL_INSTALL = normal_installs + + def get_src_base_path(self, fpath): + return os.path.abspath(os.path.join(self.src_base, fpath)).replace("\\","/") + + def __add_package_def(self, key, setup, url, local): + self.URLS[key] = {'setup':setup, 'url':url, 'local':self.get_src_base_path(local)} + + def __init_url(self, urls): + for key, url_dict in urls.items(): + url = url_dict['url'] + if not url.startswith("http://"): + url = self.get_src_base_path(url) + self.__add_package_def(key, url_dict["setup"], url, url_dict["local"]) + +def ensure_dir(dir, logger): + if not os.path.exists(dir): + logger.notify('Creating directory %s' % dir) + os.makedirs(dir) + +def extend_parser(parser): + parser.add_option( + '--index-url', + metavar='INDEX_URL', + dest='index_url', + default='http://pypi.python.org/simple/', + help='base URL of Python Package Index') + parser.add_option( + '--type-install', + metavar='type_install', + dest='type_install', + default='local', + help='type install : local, url, setup') + parser.add_option( + '--ignore-packages', + metavar='ignore_packages', + dest='ignore_packages', + default=None, + help='list of comma separated keys for package to ignore') + +def adjust_options(options, args): + pass + + +def install_pylucene(option_str, extra_env, res_source_key, home_dir, lib_dir, tmp_dir, src_dir, res_env, logger, call_subprocess, filter_python_develop): + + logger.notify("Get Pylucene from %s " % res_env.URLS['PYLUCENE'][res_source_key]) + pylucene_src = os.path.join(src_dir,"pylucene.tar.gz") + if res_source_key == 'local': + shutil.copy(res_env.URLS['PYLUCENE'][res_source_key], pylucene_src) + else: + urllib.urlretrieve(res_env.URLS['PYLUCENE'][res_source_key], pylucene_src) + tf = tarfile.open(pylucene_src,'r:gz') + pylucene_base_path = os.path.join(src_dir,"pylucene") + logger.notify("Extract Pylucene to %s " % pylucene_base_path) + tf.extractall(pylucene_base_path) + tf.close() + + pylucene_src_path = os.path.join(pylucene_base_path, os.listdir(pylucene_base_path)[0]) + jcc_src_path = os.path.abspath(os.path.join(pylucene_src_path,"jcc")) + + #install jcc + + #patch for linux + if system_str == 'Linux' : + olddir = os.getcwd() + setuptools_path = os.path.join(lib_dir, 'site-packages', 'setuptools') + if os.path.exists(setuptools_path) and os.path.isdir(setuptools_path): + patch_dest_path = os.path.join(lib_dir, 'site-packages') + else: + patch_dest_path = os.path.join(lib_dir,'site-packages','setuptools-0.6c11-py%s.%s.egg' % (sys.version_info[0], sys.version_info[1])) + if os.path.isfile(patch_dest_path): + # must unzip egg + # rename file and etract all + shutil.move(patch_dest_path, 
patch_dest_path + ".zip") + zf = zipfile.ZipFile(patch_dest_path + ".zip",'r') + zf.extractall(patch_dest_path) + os.remove(patch_dest_path + ".zip") + logger.notify("Patch jcc : %s " % (patch_dest_path)) + os.chdir(patch_dest_path) + p = patch.fromfile(os.path.join(jcc_src_path,"jcc","patches","patch.43.0.6c11")) + p.apply() + os.chdir(olddir) + + logger.notify("Install jcc") + call_subprocess([os.path.abspath(os.path.join(home_dir, 'bin', 'python')), 'setup.py', 'install'], + cwd=jcc_src_path, + filter_stdout=filter_python_develop, + show_stdout=True) + #install pylucene + + logger.notify("Install pylucene") + #modify makefile + makefile_path = os.path.join(pylucene_src_path,"Makefile") + logger.notify("Modify makefile %s " % makefile_path) + shutil.move( makefile_path, makefile_path+"~" ) + + destination= open( makefile_path, "w" ) + source= open( makefile_path+"~", "r" ) + destination.write("PREFIX_PYTHON="+os.path.abspath(home_dir)+"\n") + destination.write("ANT=ant\n") + destination.write("PYTHON=$(PREFIX_PYTHON)/bin/python\n") + + if system_str == "Darwin": + if sys.version_info >= (2,6): + destination.write("JCC=$(PYTHON) -m jcc.__main__ --shared --arch x86_64 --arch i386\n") + else: + destination.write("JCC=$(PYTHON) -m jcc --shared --arch x86_64 --arch i386\n") + destination.write("NUM_FILES=2\n") + elif system_str == "Windows": + destination.write("JCC=$(PYTHON) -m jcc.__main__ --shared --arch x86_64 --arch i386\n") + destination.write("NUM_FILES=2\n") + else: + if sys.version_info >= (2,6) and sys.version_info <= (2,7): + destination.write("JCC=$(PYTHON) -m jcc.__main__ --shared\n") + else: + destination.write("JCC=$(PYTHON) -m jcc --shared\n") + destination.write("NUM_FILES=2\n") + for line in source: + destination.write( line ) + source.close() + destination.close() + os.remove(makefile_path+"~" ) + + logger.notify("pylucene make") + call_subprocess(['make'], + cwd=os.path.abspath(pylucene_src_path), + filter_stdout=filter_python_develop, + show_stdout=True) + + logger.notify("pylucene make install") + call_subprocess(['make', 'install'], + cwd=os.path.abspath(pylucene_src_path), + filter_stdout=filter_python_develop, + show_stdout=True) + + +def install_psycopg2(option_str, extra_env, res_source_key, home_dir, lib_dir, tmp_dir, src_dir, res_env, logger, call_subprocess, filter_python_develop): + psycopg2_src = os.path.join(src_dir,"psycopg2.zip") + shutil.copy(res_env.URLS['PSYCOPG2'][res_source_key], psycopg2_src) + #extract psycopg2 + zf = zipfile.ZipFile(psycopg2_src) + psycopg2_base_path = os.path.join(src_dir,"psycopg2") + zf.extractall(psycopg2_base_path) + zf.close() + + psycopg2_src_path = os.path.join(psycopg2_base_path, os.listdir(psycopg2_base_path)[0]) + shutil.copytree(os.path.join(psycopg2_src_path, 'psycopg2'), os.path.abspath(os.path.join(home_dir, 'Lib/site-packages', 'psycopg2'))) + shutil.copy(os.path.join(psycopg2_src_path, 'psycopg2-2.0.14-py2.6.egg-info'), os.path.abspath(os.path.join(home_dir, 'Lib/site-packages', 'site-packages'))) + + + +def lib_generate_install_methods(path_locations, src_base, Logger, call_subprocess, normal_installs, urls=None): + + all_urls = URLS.copy() + if urls is not None: + all_urls.update(urls) + + res_env = ResourcesEnv(src_base, all_urls, normal_installs) + + def filter_python_develop(line): + if not line.strip(): + return Logger.DEBUG + for prefix in ['Searching for', 'Reading ', 'Best match: ', 'Processing ', + 'Moving ', 'Adding ', 'running ', 'writing ', 'Creating ', + 'creating ', 'Copying ']: + if 
line.startswith(prefix): + return Logger.DEBUG + return Logger.NOTIFY + + + def normal_install(key, method, option_str, extra_env, res_source_key, home_dir, tmp_dir, res_env, logger, call_subprocess): + logger.notify("Install %s from %s with %s" % (key,res_env.URLS[key][res_source_key],method)) + if method == 'pip': + if sys.platform == 'win32': + args = [os.path.abspath(os.path.join(home_dir, 'Scripts', 'pip')), 'install', '-E', os.path.abspath(home_dir), res_env.URLS[key][res_source_key]] + else: + args = [os.path.abspath(os.path.join(home_dir, 'bin', 'pip')), 'install', '-E', os.path.abspath(home_dir), res_env.URLS[key][res_source_key]] + if option_str : + args.insert(4,option_str) + call_subprocess(args, + cwd=os.path.abspath(tmp_dir), + filter_stdout=filter_python_develop, + show_stdout=True, + extra_env=extra_env) + else: + if sys.platform == 'win32': + args = [os.path.abspath(os.path.join(home_dir, 'Scripts', 'easy_install')), res_env.URLS[key][res_source_key]] + else: + args = [os.path.abspath(os.path.join(home_dir, 'bin', 'easy_install')), res_env.URLS[key][res_source_key]] + if option_str : + args.insert(1,option_str) + call_subprocess(args, + cwd=os.path.abspath(tmp_dir), + filter_stdout=filter_python_develop, + show_stdout=True, + extra_env=extra_env) + + + def after_install(options, home_dir): + + global logger + + verbosity = options.verbose - options.quiet + logger = Logger([(Logger.level_for_integer(2-verbosity), sys.stdout)]) + + + home_dir, lib_dir, inc_dir, bin_dir = path_locations(home_dir) + base_dir = os.path.dirname(home_dir) + src_dir = os.path.join(home_dir, 'src') + tmp_dir = os.path.join(home_dir, 'tmp') + ensure_dir(src_dir, logger) + ensure_dir(tmp_dir, logger) + system_str = platform.system() + + res_source_key = options.type_install + + ignore_packages = [] + + if options.ignore_packages : + ignore_packages = options.ignore_packages.split(",") + + logger.indent += 2 + try: + for key, method, option_str, extra_env in res_env.NORMAL_INSTALL: + if key not in ignore_packages: + if callable(method): + method(option_str, extra_env, res_source_key, home_dir, lib_dir, tmp_dir, src_dir, res_env, logger, call_subprocess, filter_python_develop) + else: + normal_install(key, method, option_str, extra_env, res_source_key, home_dir, tmp_dir, res_env, logger, call_subprocess) + + logger.notify("Clear source dir") + shutil.rmtree(src_dir) + + finally: + logger.indent -= 2 + script_dir = join(base_dir, bin_dir) + logger.notify('Run "%s Package" to install new packages that provide builds' + % join(script_dir, 'easy_install')) + + + return adjust_options, extend_parser, after_install diff -r 000000000000 -r 896db0083b76 virtualenv/res/lib/patch.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/virtualenv/res/lib/patch.py Thu May 26 17:15:39 2011 +0200 @@ -0,0 +1,589 @@ +""" Patch utility to apply unified diffs + + Brute-force line-by-line non-recursive parsing + + Copyright (c) 2008-2010 anatoly techtonik + Available under the terms of MIT license + + Project home: http://code.google.com/p/python-patch/ + + + $Id: patch.py 76 2010-04-08 19:10:21Z techtonik $ + $HeadURL: https://python-patch.googlecode.com/svn/trunk/patch.py $ +""" + +__author__ = "techtonik.rainforce.org" +__version__ = "10.04" + +import copy +import logging +import re +# cStringIO doesn't support unicode in 2.5 +from StringIO import StringIO +from logging import debug, info, warning + +from os.path import exists, isfile, abspath +from os import unlink + + +#------------------------------------------------ 
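# Illustrative usage sketch, not part of the committed file: it mirrors how
# install_pylucene() above drives this vendored module (patch.fromfile(...)
# followed by .apply()); the file names in the sketch are assumptions.
#
#     import patch
#     p = patch.fromfile("setuptools.patch")    # parse a unified diff from disk
#     if p.can_patch("setup.py") is not False:  # None means the file is not named in the diff
#         p.apply()                             # validate hunks, then rewrite matching files in place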
+# Logging is controlled by "python_patch" logger + +debugmode = False + +logger = logging.getLogger("python_patch") +loghandler = logging.StreamHandler() +logger.addHandler(loghandler) + +debug = logger.debug +info = logger.info +warning = logger.warning + +#: disable library logging by default +logger.setLevel(logging.CRITICAL) + +#------------------------------------------------ + + +def fromfile(filename): + """ Parse patch file and return Patch() object + """ + + info("reading patch from file %s" % filename) + fp = open(filename, "rb") + patch = Patch(fp) + fp.close() + return patch + + +def fromstring(s): + """ Parse text string and return Patch() object + """ + + return Patch( + StringIO.StringIO(s) + ) + + + +class HunkInfo(object): + """ Parsed hunk data container (hunk starts with @@ -R +R @@) """ + + def __init__(self): + self.startsrc=None #: line count starts with 1 + self.linessrc=None + self.starttgt=None + self.linestgt=None + self.invalid=False + self.text=[] + + def copy(self): + return copy.copy(self) + +# def apply(self, estream): +# """ write hunk data into enumerable stream +# return strings one by one until hunk is +# over +# +# enumerable stream are tuples (lineno, line) +# where lineno starts with 0 +# """ +# pass + + + +class Patch(object): + + def __init__(self, stream=None): + + # define Patch data members + # table with a row for every source file + + #: list of source filenames + self.source=None + self.target=None + #: list of lists of hunks + self.hunks=None + #: file endings statistics for every hunk + self.hunkends=None + + if stream: + self.parse(stream) + + def copy(self): + return copy.copy(self) + + def parse(self, stream): + """ parse unified diff """ + self.source = [] + self.target = [] + self.hunks = [] + self.hunkends = [] + + # define possible file regions that will direct the parser flow + header = False # comments before the patch body + filenames = False # lines starting with --- and +++ + + hunkhead = False # @@ -R +R @@ sequence + hunkbody = False # + hunkskip = False # skipping invalid hunk mode + + header = True + lineends = dict(lf=0, crlf=0, cr=0) + nextfileno = 0 + nexthunkno = 0 #: even if index starts with 0 user messages number hunks from 1 + + # hunkinfo holds parsed values, hunkactual - calculated + hunkinfo = HunkInfo() + hunkactual = dict(linessrc=None, linestgt=None) + + fe = enumerate(stream) + for lineno, line in fe: + + # analyze state + if header and line.startswith("--- "): + header = False + # switch to filenames state + filenames = True + #: skip hunkskip and hunkbody code until you read definition of hunkhead + if hunkbody: + # process line first + if re.match(r"^[- \+\\]", line): + # gather stats about line endings + if line.endswith("\r\n"): + self.hunkends[nextfileno-1]["crlf"] += 1 + elif line.endswith("\n"): + self.hunkends[nextfileno-1]["lf"] += 1 + elif line.endswith("\r"): + self.hunkends[nextfileno-1]["cr"] += 1 + + if line.startswith("-"): + hunkactual["linessrc"] += 1 + elif line.startswith("+"): + hunkactual["linestgt"] += 1 + elif not line.startswith("\\"): + hunkactual["linessrc"] += 1 + hunkactual["linestgt"] += 1 + hunkinfo.text.append(line) + # todo: handle \ No newline cases + else: + warning("invalid hunk no.%d at %d for target file %s" % (nexthunkno, lineno+1, self.target[nextfileno-1])) + # add hunk status node + self.hunks[nextfileno-1].append(hunkinfo.copy()) + self.hunks[nextfileno-1][nexthunkno-1]["invalid"] = True + # switch to hunkskip state + hunkbody = False + hunkskip = True + + # check exit 
conditions + if hunkactual["linessrc"] > hunkinfo.linessrc or hunkactual["linestgt"] > hunkinfo.linestgt: + warning("extra hunk no.%d lines at %d for target %s" % (nexthunkno, lineno+1, self.target[nextfileno-1])) + # add hunk status node + self.hunks[nextfileno-1].append(hunkinfo.copy()) + self.hunks[nextfileno-1][nexthunkno-1]["invalid"] = True + # switch to hunkskip state + hunkbody = False + hunkskip = True + elif hunkinfo.linessrc == hunkactual["linessrc"] and hunkinfo.linestgt == hunkactual["linestgt"]: + self.hunks[nextfileno-1].append(hunkinfo.copy()) + # switch to hunkskip state + hunkbody = False + hunkskip = True + + # detect mixed window/unix line ends + ends = self.hunkends[nextfileno-1] + if ((ends["cr"]!=0) + (ends["crlf"]!=0) + (ends["lf"]!=0)) > 1: + warning("inconsistent line ends in patch hunks for %s" % self.source[nextfileno-1]) + if debugmode: + debuglines = dict(ends) + debuglines.update(file=self.target[nextfileno-1], hunk=nexthunkno) + debug("crlf: %(crlf)d lf: %(lf)d cr: %(cr)d\t - file: %(file)s hunk: %(hunk)d" % debuglines) + + if hunkskip: + match = re.match("^@@ -(\d+)(,(\d+))? \+(\d+)(,(\d+))?", line) + if match: + # switch to hunkhead state + hunkskip = False + hunkhead = True + elif line.startswith("--- "): + # switch to filenames state + hunkskip = False + filenames = True + if debugmode and len(self.source) > 0: + debug("- %2d hunks for %s" % (len(self.hunks[nextfileno-1]), self.source[nextfileno-1])) + + if filenames: + if line.startswith("--- "): + if nextfileno in self.source: + warning("skipping invalid patch for %s" % self.source[nextfileno]) + del self.source[nextfileno] + # double source filename line is encountered + # attempt to restart from this second line + re_filename = "^--- ([^\t]+)" + match = re.match(re_filename, line) + # todo: support spaces in filenames + if match: + self.source.append(match.group(1).strip()) + else: + warning("skipping invalid filename at line %d" % lineno) + # switch back to header state + filenames = False + header = True + elif not line.startswith("+++ "): + if nextfileno in self.source: + warning("skipping invalid patch with no target for %s" % self.source[nextfileno]) + del self.source[nextfileno] + else: + # this should be unreachable + warning("skipping invalid target patch") + filenames = False + header = True + else: + if nextfileno in self.target: + warning("skipping invalid patch - double target at line %d" % lineno) + del self.source[nextfileno] + del self.target[nextfileno] + nextfileno -= 1 + # double target filename line is encountered + # switch back to header state + filenames = False + header = True + else: + re_filename = "^\+\+\+ ([^\t]+)" + match = re.match(re_filename, line) + if not match: + warning("skipping invalid patch - no target filename at line %d" % lineno) + # switch back to header state + filenames = False + header = True + else: + self.target.append(match.group(1).strip()) + nextfileno += 1 + # switch to hunkhead state + filenames = False + hunkhead = True + nexthunkno = 0 + self.hunks.append([]) + self.hunkends.append(lineends.copy()) + continue + + if hunkhead: + match = re.match("^@@ -(\d+)(,(\d+))? 
\+(\d+)(,(\d+))?", line) + if not match: + if nextfileno-1 not in self.hunks: + warning("skipping invalid patch with no hunks for file %s" % self.target[nextfileno-1]) + # switch to header state + hunkhead = False + header = True + continue + else: + # switch to header state + hunkhead = False + header = True + else: + hunkinfo.startsrc = int(match.group(1)) + hunkinfo.linessrc = 1 + if match.group(3): hunkinfo.linessrc = int(match.group(3)) + hunkinfo.starttgt = int(match.group(4)) + hunkinfo.linestgt = 1 + if match.group(6): hunkinfo.linestgt = int(match.group(6)) + hunkinfo.invalid = False + hunkinfo.text = [] + + hunkactual["linessrc"] = hunkactual["linestgt"] = 0 + + # switch to hunkbody state + hunkhead = False + hunkbody = True + nexthunkno += 1 + continue + else: + if not hunkskip: + warning("patch file incomplete - %s" % filename) + # sys.exit(?) + else: + # duplicated message when an eof is reached + if debugmode and len(self.source) > 0: + debug("- %2d hunks for %s" % (len(self.hunks[nextfileno-1]), self.source[nextfileno-1])) + + info("total files: %d total hunks: %d" % (len(self.source), sum(len(hset) for hset in self.hunks))) + + + def apply(self): + """ apply parsed patch """ + + total = len(self.source) + for fileno, filename in enumerate(self.source): + + f2patch = filename + if not exists(f2patch): + f2patch = self.target[fileno] + if not exists(f2patch): + warning("source/target file does not exist\n--- %s\n+++ %s" % (filename, f2patch)) + continue + if not isfile(f2patch): + warning("not a file - %s" % f2patch) + continue + filename = f2patch + + info("processing %d/%d:\t %s" % (fileno+1, total, filename)) + + # validate before patching + f2fp = open(filename) + hunkno = 0 + hunk = self.hunks[fileno][hunkno] + hunkfind = [] + hunkreplace = [] + validhunks = 0 + canpatch = False + for lineno, line in enumerate(f2fp): + if lineno+1 < hunk.startsrc: + continue + elif lineno+1 == hunk.startsrc: + hunkfind = [x[1:].rstrip("\r\n") for x in hunk.text if x[0] in " -"] + hunkreplace = [x[1:].rstrip("\r\n") for x in hunk.text if x[0] in " +"] + #pprint(hunkreplace) + hunklineno = 0 + + # todo \ No newline at end of file + + # check hunks in source file + if lineno+1 < hunk.startsrc+len(hunkfind)-1: + if line.rstrip("\r\n") == hunkfind[hunklineno]: + hunklineno+=1 + else: + debug("hunk no.%d doesn't match source file %s" % (hunkno+1, filename)) + # file may be already patched, but we will check other hunks anyway + hunkno += 1 + if hunkno < len(self.hunks[fileno]): + hunk = self.hunks[fileno][hunkno] + continue + else: + break + + # check if processed line is the last line + if lineno+1 == hunk.startsrc+len(hunkfind)-1: + debug("file %s hunk no.%d -- is ready to be patched" % (filename, hunkno+1)) + hunkno+=1 + validhunks+=1 + if hunkno < len(self.hunks[fileno]): + hunk = self.hunks[fileno][hunkno] + else: + if validhunks == len(self.hunks[fileno]): + # patch file + canpatch = True + break + else: + if hunkno < len(self.hunks[fileno]): + warning("premature end of source file %s at hunk %d" % (filename, hunkno+1)) + + f2fp.close() + + if validhunks < len(self.hunks[fileno]): + if self._match_file_hunks(filename, self.hunks[fileno]): + warning("already patched %s" % filename) + else: + warning("source file is different - %s" % filename) + if canpatch: + backupname = filename+".orig" + if exists(backupname): + warning("can't backup original file to %s - aborting" % backupname) + else: + import shutil + shutil.move(filename, backupname) + if self.write_hunks(backupname, filename, 
self.hunks[fileno]): + warning("successfully patched %s" % filename) + unlink(backupname) + else: + warning("error patching file %s" % filename) + shutil.copy(filename, filename+".invalid") + warning("invalid version is saved to %s" % filename+".invalid") + # todo: proper rejects + shutil.move(backupname, filename) + + # todo: check for premature eof + + + def can_patch(self, filename): + """ Check if specified filename can be patched. Returns None if file can + not be found among source filenames. False if patch can not be applied + clearly. True otherwise. + + :returns: True, False or None + """ + idx = self._get_file_idx(filename, source=True) + if idx == None: + return None + return self._match_file_hunks(filename, self.hunks[idx]) + + + def _match_file_hunks(self, filepath, hunks): + matched = True + fp = open(abspath(filepath)) + + class NoMatch(Exception): + pass + + lineno = 1 + line = fp.readline() + hno = None + try: + for hno, h in enumerate(hunks): + # skip to first line of the hunk + while lineno < h.starttgt: + if not len(line): # eof + debug("check failed - premature eof before hunk: %d" % (hno+1)) + raise NoMatch + line = fp.readline() + lineno += 1 + for hline in h.text: + if hline.startswith("-"): + continue + if not len(line): + debug("check failed - premature eof on hunk: %d" % (hno+1)) + # todo: \ No newline at the end of file + raise NoMatch + if line.rstrip("\r\n") != hline[1:].rstrip("\r\n"): + debug("file is not patched - failed hunk: %d" % (hno+1)) + raise NoMatch + line = fp.readline() + lineno += 1 + + except NoMatch: + matched = False + # todo: display failed hunk, i.e. expected/found + + fp.close() + return matched + + + def patch_stream(self, instream, hunks): + """ Generator that yields stream patched with hunks iterable + + Converts lineends in hunk lines to the best suitable format + autodetected from input + """ + + # todo: At the moment substituted lineends may not be the same + # at the start and at the end of patching. Also issue a + # warning/throw about mixed lineends (is it really needed?) 
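        # Illustrative note, not part of the committed file: each hunk line handled below keeps
        # its one-character diff prefix -- ' ' context lines consume one source line and are
        # emitted without the prefix, '+' additions are emitted without consuming the source,
        # and '-' removals (like '\' markers) consume a source line and emit nothing; emitted
        # lines are switched to the single line-ending style detected in the source stream
        # whenever its endings are consistent.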
+ + hunks = iter(hunks) + + srclineno = 1 + + lineends = {'\n':0, '\r\n':0, '\r':0} + def get_line(): + """ + local utility function - return line from source stream + collecting line end statistics on the way + """ + line = instream.readline() + # 'U' mode works only with text files + if line.endswith("\r\n"): + lineends["\r\n"] += 1 + elif line.endswith("\n"): + lineends["\n"] += 1 + elif line.endswith("\r"): + lineends["\r"] += 1 + return line + + for hno, h in enumerate(hunks): + debug("hunk %d" % (hno+1)) + # skip to line just before hunk starts + while srclineno < h.startsrc: + yield get_line() + srclineno += 1 + + for hline in h.text: + # todo: check \ No newline at the end of file + if hline.startswith("-") or hline.startswith("\\"): + get_line() + srclineno += 1 + continue + else: + if not hline.startswith("+"): + get_line() + srclineno += 1 + line2write = hline[1:] + # detect if line ends are consistent in source file + if sum([bool(lineends[x]) for x in lineends]) == 1: + newline = [x for x in lineends if lineends[x] != 0][0] + yield line2write.rstrip("\r\n")+newline + else: # newlines are mixed + yield line2write + + for line in instream: + yield line + + + def write_hunks(self, srcname, tgtname, hunks): + src = open(srcname, "rb") + tgt = open(tgtname, "wb") + + debug("processing target file %s" % tgtname) + + tgt.writelines(self.patch_stream(src, hunks)) + + tgt.close() + src.close() + return True + + + def _get_file_idx(self, filename, source=None): + """ Detect index of given filename within patch. + + :param filename: + :param source: search filename among sources (True), + targets (False), or both (None) + :returns: int or None + """ + filename = abspath(filename) + if source == True or source == None: + for i,fnm in enumerate(self.source): + if filename == abspath(fnm): + return i + if source == False or source == None: + for i,fnm in enumerate(self.target): + if filename == abspath(fnm): + return i + + + + +from optparse import OptionParser +from os.path import exists +import sys + +if __name__ == "__main__": + opt = OptionParser(usage="%prog [options] unipatch-file", version="python-patch %s" % __version__) + opt.add_option("--debug", action="store_true", dest="debugmode", help="debug mode") + (options, args) = opt.parse_args() + + if not args: + opt.print_version() + opt.print_help() + sys.exit() + debugmode = options.debugmode + patchfile = args[0] + if not exists(patchfile) or not isfile(patchfile): + sys.exit("patch file does not exist - %s" % patchfile) + + + if debugmode: + loglevel = logging.DEBUG + logformat = "%(levelname)8s %(message)s" + else: + loglevel = logging.INFO + logformat = "%(message)s" + logger.setLevel(loglevel) + loghandler.setFormatter(logging.Formatter(logformat)) + + + + patch = fromfile(patchfile) + #pprint(patch) + patch.apply() + + # todo: document and test line ends handling logic - patch.py detects proper line-endings + # for inserted hunks and issues a warning if patched file has incosistent line ends diff -r 000000000000 -r 896db0083b76 virtualenv/res/src/Django-1.3.tar.gz Binary file virtualenv/res/src/Django-1.3.tar.gz has changed diff -r 000000000000 -r 896db0083b76 virtualenv/res/src/Imaging-1.1.7.tar.gz Binary file virtualenv/res/src/Imaging-1.1.7.tar.gz has changed diff -r 000000000000 -r 896db0083b76 virtualenv/res/src/JCC-2.6-py2.6-win32.egg Binary file virtualenv/res/src/JCC-2.6-py2.6-win32.egg has changed diff -r 000000000000 -r 896db0083b76 virtualenv/res/src/PIL-1.1.7.win32-py2.6.exe Binary file 
virtualenv/res/src/PIL-1.1.7.win32-py2.6.exe has changed diff -r 000000000000 -r 896db0083b76 virtualenv/res/src/django-extensions-0.6.tar.gz Binary file virtualenv/res/src/django-extensions-0.6.tar.gz has changed diff -r 000000000000 -r 896db0083b76 virtualenv/res/src/django-haystack-v1.2.2.tar.gz Binary file virtualenv/res/src/django-haystack-v1.2.2.tar.gz has changed diff -r 000000000000 -r 896db0083b76 virtualenv/res/src/httplib2-0.6.0.tar.gz Binary file virtualenv/res/src/httplib2-0.6.0.tar.gz has changed diff -r 000000000000 -r 896db0083b76 virtualenv/res/src/lucene-3.0.2-py2.6-win32.egg Binary file virtualenv/res/src/lucene-3.0.2-py2.6-win32.egg has changed diff -r 000000000000 -r 896db0083b76 virtualenv/res/src/lxml-2.2.2-py2.6-win32.egg Binary file virtualenv/res/src/lxml-2.2.2-py2.6-win32.egg has changed diff -r 000000000000 -r 896db0083b76 virtualenv/res/src/lxml-2.3.tar.bz2 Binary file virtualenv/res/src/lxml-2.3.tar.bz2 has changed diff -r 000000000000 -r 896db0083b76 virtualenv/res/src/psycopg2-2.4.tar.gz Binary file virtualenv/res/src/psycopg2-2.4.tar.gz has changed diff -r 000000000000 -r 896db0083b76 virtualenv/res/src/pylucene-3.1.0-1-src.tar.gz Binary file virtualenv/res/src/pylucene-3.1.0-1-src.tar.gz has changed diff -r 000000000000 -r 896db0083b76 virtualenv/res/src/south-0.7.3.tar.gz Binary file virtualenv/res/src/south-0.7.3.tar.gz has changed diff -r 000000000000 -r 896db0083b76 virtualenv/res/src/whoosh-1.8.3.tar.bz2 Binary file virtualenv/res/src/whoosh-1.8.3.tar.bz2 has changed diff -r 000000000000 -r 896db0083b76 virtualenv/web/.keepme diff -r 000000000000 -r 896db0083b76 virtualenv/web/create_python_env.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/virtualenv/web/create_python_env.py Thu May 26 17:15:39 2011 +0200 @@ -0,0 +1,74 @@ +""" +Call this like ``python create_python_env.py``; it will +refresh the project-boot.py script + +-prerequisite: + +- virtualenv +- distribute +- psycopg2 requires the PostgreSQL libpq libraries and the pg_config utility + +- python project-boot.py --no-site-packages --clear --ignore-packages=MYSQL --type-install=local +- For Linux : +python project-boot.py --unzip-setuptools --no-site-packages --ignore-packages=MYSQL --clear --type-install=local + +""" + +import os +import subprocess +import re +import sys + + +here = os.path.dirname(os.path.abspath(__file__)) +base_dir = here +script_name = os.path.join(base_dir, 'project-boot.py') + +import virtualenv + +# things to install +# - psycopg2 -> pip +# - PIL -> pip +# - pyxml -> pip +# - 4Suite-xml - easy_install ftp://ftp.4suite.org/pub/4Suite/4Suite-XML-1.0.2.tar.bz2 +# - pylucene - script + +src_base = os.path.abspath(os.path.join(here,"..","res","src")).replace("\\","/") +lib_path = os.path.abspath(os.path.join(here,"..","res","lib")).replace("\\","/") +patch_path = os.path.abspath(os.path.join(here,"res","patch")).replace("\\","/") + + +EXTRA_TEXT = "import sys\n" +EXTRA_TEXT += "sys.path.append('%s')\n" % (lib_path) +EXTRA_TEXT += "sys.path.append('%s')\n" % (os.path.abspath(os.path.join(here,"res")).replace("\\","/")) +EXTRA_TEXT += "from res_create_env import generate_install_methods\n" +EXTRA_TEXT += "adjust_options, extend_parser, after_install = generate_install_methods(path_locations, '%s', Logger, call_subprocess)\n" % (src_base) + + + +#f = open(os.path.join(os.path. 
os.path.join(os.path.dirname(os.path.abspath(__file__)),"res"),'res_create_env.py'), 'r')
+#EXTRA_TEXT += f.read()
+#EXTRA_TEXT += "\n"
+#EXTRA_TEXT += "RES_ENV = ResourcesEnv('%s')\n" % (src_base)
+
+def main():
+    python_version = ".".join(map(str,sys.version_info[0:2]))
+    text = virtualenv.create_bootstrap_script(EXTRA_TEXT, python_version=python_version)
+    if os.path.exists(script_name):
+        f = open(script_name)
+        cur_text = f.read()
+        f.close()
+    else:
+        cur_text = ''
+    print 'Updating %s' % script_name
+    if cur_text == text:
+        print 'No update'
+    else:
+        print 'Script changed; updating...'
+        f = open(script_name, 'w')
+        f.write(text)
+        f.close()
+
+if __name__ == '__main__':
+    main()
+
diff -r 000000000000 -r 896db0083b76 virtualenv/web/res/res_create_env.py
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/virtualenv/web/res/res_create_env.py Thu May 26 17:15:39 2011 +0200
@@ -0,0 +1,45 @@
+import platform
+
+from lib_create_env import lib_generate_install_methods, install_pylucene, install_psycopg2
+
+system_str = platform.system()
+
+if system_str == 'Windows':
+    INSTALLS = [
+        ('JCC','easy_install',None,None),
+        ('PSYCOPG2',install_psycopg2,None,None),
+        ('PYLUCENE','easy_install',None,None),
+    ]
+else:
+    INSTALLS = [
+        ('PYLUCENE',install_pylucene,None,None),
+        ('PSYCOPG2', 'pip', None, None),
+    ]
+
+if system_str == 'Linux':
+    INSTALLS.extend([
+        ('DISTRIBUTE', 'pip', None, None),
+    ])
+
+INSTALLS.extend([ #(key, method, option_str, dict_extra_env)
+    ('PIL', 'easy_install', None, None),
+    ('DJANGO','pip', None, None),
+    ('DJANGO-EXTENSIONS', 'pip', None, None),
+    ('HTTPLIB2', 'pip', None, None),
+    ('SOUTH', 'pip', None, None),
+    ('WHOOSH', 'pip', None, None),
+    ('HAYSTACK', 'pip', None, None),
+])
+
+if system_str == "Darwin":
+    INSTALLS.extend([
+        ('LXML', 'easy_install', None, {'STATIC_DEPS': 'true', 'LIBXML2_VERSION': '2.7.8', 'LIBXSLT_VERSION': '1.1.26', 'LIBICONV_VERSION': '1.13.1'}),
+    ])
+else:
+    INSTALLS.extend([
+        ('LXML', 'easy_install', None, None),
+    ])
+
+
+def generate_install_methods(path_locations, src_base, Logger, call_subprocess):
+    return lib_generate_install_methods(path_locations, src_base, Logger, call_subprocess, INSTALLS)
diff -r 000000000000 -r 896db0083b76 web/.keepme
diff -r 000000000000 -r 896db0083b76 web/hdabo/.htaccess.mod_python.tmpl
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/web/hdabo/.htaccess.mod_python.tmpl Thu May 26 17:15:39 2011 +0200
@@ -0,0 +1,11 @@
+SetHandler python-program
+PythonHandler ldt.core.handlers.modpython
+SetEnv DJANGO_SETTINGS_MODULE ldtplatform.settings
+PythonInterpreter platform
+PythonOption django.root /~wakimd/platform/ldtplatform
+PythonOption virtualenv.activate_path /iridata/users/wakimd/Env/Efculture/bin/activate_this.py
+PythonDebug on
+PythonPath "['/iridata/users/wakimd/Env/Efculture/lib/python2.6/site-packages'] + sys.path"
+Header set Pragma "no-cache"
+Header set Cache-Control "no-cache"
+Header set Expires "-1"
diff -r 000000000000 -r 896db0083b76 web/hdabo/.htaccess.mod_wsgi.tmpl
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/web/hdabo/.htaccess.mod_wsgi.tmpl Thu May 26 17:15:39 2011 +0200
@@ -0,0 +1,19 @@
+
+SetEnv DJANGO_SETTINGS_MODULE hdabo.settings
+SetEnv PROJECT_PATH /Users/ymh/dev/workspace/hdabo/web
+SetEnv PYTHON_PATH /Users/ymh/dev/workspace/hdabo/virtualenv/web/env/hdabo/lib/python2.6/site-packages
+
+Options ExecCGI FollowSymLinks
+SetHandler wsgi-script
+
+#if defined in global definition
+#defined with WSGIDaemonProcess
+#WSGIProcessGroup platform
+
+RewriteEngine On
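# Illustrative note, not part of the committed template: the RewriteCond/RewriteRule pair
# that follows forwards every request that does not resolve to an existing file to the
# modwsgi.wsgi script, appending the requested path and keeping the query string (QSA);
# PT hands the rewritten URL back to Apache's URL mapping and L stops further rewriting
# in this pass.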
+RewriteCond %{REQUEST_FILENAME} !-f +RewriteRule ^(.*)$ /~ymh/hdabo/hdabo/modwsgi.wsgi/$1 [QSA,PT,L] + +Header set Pragma "no-cache" +Header set Cache-Control "no-cache" +Header set Expires "-1" diff -r 000000000000 -r 896db0083b76 web/hdabo/__init__.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/web/hdabo/__init__.py Thu May 26 17:15:39 2011 +0200 @@ -0,0 +1,16 @@ +VERSION = (0, 1, 0, "final", 0) + + +def get_version(): + version = '%s.%s' % (VERSION[0], VERSION[1]) + if VERSION[2]: + version = '%s.%s' % (version, VERSION[2]) + if VERSION[3:] == ('alpha', 0): + version = '%s pre-alpha' % version + else: + if VERSION[3] != 'final': + version = '%s %s %s' % (version, VERSION[3], VERSION[4]) + return version + + +__version__ = get_version() diff -r 000000000000 -r 896db0083b76 web/hdabo/admin.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/web/hdabo/admin.py Thu May 26 17:15:39 2011 +0200 @@ -0,0 +1,15 @@ +from django.contrib import admin + +from hdabo.models import Author,Datasheet,DocumentFormat,Domain,Organisation,Tag,TagCategory,TaggedSheet,TimePeriod + +admin.site.register(Author) +admin.site.register(Datasheet) +admin.site.register(DocumentFormat) +admin.site.register(Domain) +admin.site.register(Organisation) +admin.site.register(Tag) +admin.site.register(TagCategory) +admin.site.register(TaggedSheet) +admin.site.register(TimePeriod) + + diff -r 000000000000 -r 896db0083b76 web/hdabo/config.py.tmpl --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/web/hdabo/config.py.tmpl Thu May 26 17:15:39 2011 +0200 @@ -0,0 +1,49 @@ +import os + +BASE_DIR = os.path.dirname(os.path.abspath(__file__)).rstrip("/")+"/" +BASE_URL = '/~ymh/hdabo/' +WEB_URL = 'http://localhost' + +# Absolute filesystem path to the directory that will hold user-uploaded files. +# Example: "/home/media/media.lawrence.com/media/" +MEDIA_ROOT = os.path.abspath(BASE_DIR + "../static/media/") + +# URL that handles the media served from MEDIA_ROOT. Make sure to use a +# trailing slash. +# Examples: "http://media.lawrence.com/media/", "http://example.com/media/" +MEDIA_URL = BASE_URL + "static/media/" + +# Absolute path to the directory static files should be collected to. +# Don't put anything in this directory yourself; store your static files +# in apps' "static/" subdirectories and in STATICFILES_DIRS. +# Example: "/home/media/media.lawrence.com/static/" +STATIC_ROOT = os.path.abspath(BASE_DIR + "../static/site/") + +# URL prefix for static files. +# Example: "http://media.lawrence.com/static/" +STATIC_URL = BASE_URL + "static/site/" + +# URL prefix for admin static files -- CSS, JavaScript and images. +# Make sure to use a trailing slash. +# Examples: "http://foo.com/static/admin/", "/static/admin/". +ADMIN_MEDIA_PREFIX = STATIC_URL + 'admin/' + +# Additional locations of static files +STATICFILES_DIRS = ( + # Put strings here, like "/home/html/static" or "C:/www/django/static". + # Always use forward slashes, even on Windows. + # Don't forget to use absolute paths, not relative paths. +) + + +DATABASES = { + 'default': { + 'ENGINE': 'django.db.backends.postgresql_psycopg2', # Add 'postgresql_psycopg2', 'postgresql', 'mysql', 'sqlite3' or 'oracle'. + 'NAME': 'hdabo', # Or path to database file if using sqlite3. + 'USER': 'iri', # Not used with sqlite3. + 'PASSWORD': 'iri', # Not used with sqlite3. + 'HOST': 'localhost', # Set to empty string for localhost. Not used with sqlite3. + 'PORT': '5432', # Set to empty string for default. Not used with sqlite3. 
+ } +} + diff -r 000000000000 -r 896db0083b76 web/hdabo/manage.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/web/hdabo/manage.py Thu May 26 17:15:39 2011 +0200 @@ -0,0 +1,14 @@ +#!/usr/bin/env python +from django.core.management import execute_manager +import imp +try: + imp.find_module('settings') # Assumed to be in the same directory. +except ImportError: + import sys + sys.stderr.write("Error: Can't find the file 'settings.py' in the directory containing %r. It appears you've customized things.\nYou'll have to run django-admin.py, passing it your settings module.\n" % __file__) + sys.exit(1) + +import settings + +if __name__ == "__main__": + execute_manager(settings) diff -r 000000000000 -r 896db0083b76 web/hdabo/management/__init__.py diff -r 000000000000 -r 896db0083b76 web/hdabo/management/commands/__init__.py diff -r 000000000000 -r 896db0083b76 web/hdabo/management/commands/importcsv.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/web/hdabo/management/commands/importcsv.py Thu May 26 17:15:39 2011 +0200 @@ -0,0 +1,79 @@ +# -*- coding: utf-8 -*- +''' +Created on May 25, 2011 + +@author: ymh +''' +from django.core.management.base import BaseCommand, CommandError +from hdabo.models import (Author, Datasheet, DocumentFormat, Domain, Organisation, + Tag, TagCategory, TaggedSheet, TimePeriod, Location) +import csv + +class Command(BaseCommand): + ''' + classdocs + ''' + args = '' + help = 'Import of a csv file for hdabo' + + def create_domain_period(self, row_value, klass, school_period): + res_list = [] + if not row_value: + return res_list + for label_str in [dstr.strip() for dstr in row_value.decode("latin-1").split('\x0b')]: + if label_str: + res_obj, created = klass.objects.get_or_create(label=label_str, school_period=school_period, defaults={"label":label_str,"school_period":school_period}) #@UnusedVariable + res_list.append(res_obj) + return res_list + + def handle(self, *args, **options): + for csv_path in args: + with open(csv_path, 'rU') as csv_file: + reader = csv.DictReader(csv_file, delimiter=";") + #Author Datasheet DocumentFormat Domain Organisation Tag TagCategory TaggedSheet TimePeriod + + # ID;Titre;Url;Desc;Tag;Org;Org_Home;Domaine;Periode1;Periode2;Periode3;Theme2;Theme3;Sousdom;Ville;Insee;Format;Auteur;Datcre;Datmaj;Comment + for row in reader: + # start transaction + if row[u'Autheur']: + author_array = row[u'Autheur'].split(" ") + author, created = Author.objects.get_or_create(hda_id=row[u'Autheur'], defaults={"firstname":author_array[0], "lastname":author_array[1]}) + else: + author = None + + if row[u"Org"]: + org, created = Organisation.objects.get_or_create(hda_id=row[u'Org'], defaults={"name":row[u'Org'], "website" : row[u'Org_Home']}) + else: + org = None + + if row[u"Ville"]: + loc, created = Location.objects.get_or_create(insee=row[u'Insee'], defaults={"name": row[u'Ville'], "insee": row[u'Insee']}) + else: + loc = None + + if row[u"Format"]: + format, created = DocumentFormat.objects.get_or_create(label=row[u'Format'], defaults={"label": row[u"Format"]}) + else: + format = None + + domains = self.create_domain_period(row[u"Domaine"], Domain, Domain.DOMAIN_PERIOD_DICT[u'Global']) + + primary_periods = self.create_domain_period(row[u"Periode1"], TimePeriod, TimePeriod.TIME_PERIOD_DICT[u'Primaire']) + college_periods = self.create_domain_period(row[u"Periode2"], TimePeriod, TimePeriod.TIME_PERIOD_DICT[u'Collège']) + highschool_periods = self.create_domain_period(row[u"Periode3"], TimePeriod, TimePeriod.TIME_PERIOD_DICT[u'Lycée']) + + 
primary_themes = self.create_domain_period(row[u"Sousdom"], Domain, Domain.DOMAIN_PERIOD_DICT[u'Primaire']) + college_themes = self.create_domain_period(row[u"Theme2"], Domain, Domain.DOMAIN_PERIOD_DICT[u'Collège']) + highschool_themes = self.create_domain_period(row[u"Theme3"], Domain, Domain.DOMAIN_PERIOD_DICT[u'Lycée']) + + + + tags = [] + if row[u'Tag']: + for tag in [t.strip().lower() for t in row[u'Tag'].split(u";")]: + tag_obj, created = Tag.objects.get_or_create(label=tag, defaults={"label":tag, "original_label":tag}) + + + + + #commit transaction \ No newline at end of file diff -r 000000000000 -r 896db0083b76 web/hdabo/models.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/web/hdabo/models.py Thu May 26 17:15:39 2011 +0200 @@ -0,0 +1,195 @@ +# -*- coding: utf-8 -*- + +from django.contrib.auth.models import User +from django.db import models +from hdabo.utils import Property +import datetime + +class Organisation(models.Model): + hda_id = models.CharField(max_length=512, unique=True, blank=False, null=False) + name = models.CharField(max_length=512, unique=False, blank=False, null=False) + location = models.CharField(max_length=512, unique=False, blank=True, null=True) + website = models.CharField(max_length=1024, unique=False, blank=True, null=True) + + +class Author(models.Model): + hda_id = models.CharField(max_length=512, unique=True, blank=False, null=False) + lastname = models.CharField(max_length=512, unique=False, blank=True, null=True) + firstname = models.CharField(max_length=512, unique=False, blank=True, null=True) + +class TimePeriod(models.Model): + TIME_PERIOD_CHOICES = ( + (1,u'Primaire'), + (2,u'Collège'), + (3,u'Lycée'), + ) + TIME_PERIOD_DICT = { + u'Primaire': 1, + u'Collège': 2, + u'Lycée': 3, + } + label = models.CharField(max_length=512, unique=False, blank=False, null=False) + school_period = models.IntegerField(choices=TIME_PERIOD_CHOICES) + + class Meta: + unique_together = ("label", "school_period") + + def __unicode__(self): + return unicode(self.label) + + +class Domain(models.Model): + DOMAIN_PERIOD_CHOICES = ( + (0,u'Global'), + (1,u'Primaire'), + (2,u'Collège'), + (3,u'Lycée'), + ) + DOMAIN_PERIOD_DICT = { + u'Global': 0, + u'Primaire': 1, + u'Collège': 2, + u'Lycée': 3, + } + label = models.CharField(max_length=512, unique=False, blank=False, null=False) + school_period = models.IntegerField(choices=DOMAIN_PERIOD_CHOICES) + + class Meta: + unique_together = ("label", "school_period") + + def __unicode__(self): + return unicode(self.label) + + +class DocumentFormat(models.Model): + label = models.CharField(max_length=512, unique=True, blank=False, null=False) + + def __unicode__(self): + return unicode(self.label) + +class Tag(models.Model): + label = models.CharField(max_length=1024, unique=True, blank=False, null=False) + original_label = models.CharField(max_length=1024, unique=False, blank=True, null=True, editable=False) + alias = models.CharField(max_length=1024, unique=False, blank=True, null=True) + wikipedia_url = models.URLField(verify_exists=False, max_length=512, blank=True, null=True) + dbpedia_uri = models.URLField(verify_exists=False, max_length=512, blank=True, null=True) + wikipedia_activated = models.BooleanField(default=False) + + +class TagCategory(models.Model): + label = models.CharField(max_length=512, unique=True, blank=False, null=False) + + def __unicode__(self): + return unicode(self.label) + +class Location(models.Model): + name = models.CharField(max_length=512, unique=False, blank=False, null=False) + insee = 
models.CharField(max_length=5, unique=True, blank=False, null=False) + + def __unicode__(self): + return unicode("%s : %s"%(self.name, self.insee)) + +class Datasheet(models.Model): + hda_id = models.CharField(max_length=512, unique=True, blank=False, null=False) + author = models.ForeignKey(Author, null=True, blank=True) + organisation = models.ForeignKey(Organisation) + title = models.CharField(max_length=2048, unique=False, blank=False, null=False) + description = models.TextField(blank=True, null=True) + url = models.URLField(verify_exists=False, max_length=512, blank=True, null=True) + domains = models.ManyToManyField(Domain, limit_choices_to={'school_period':Domain.DOMAIN_PERIOD_DICT[u'Global']}, related_name="datasheets") + primary_periods = models.ManyToManyField(TimePeriod, limit_choices_to={'school_period':TimePeriod.TIME_PERIOD_DICT[u'Primaire']}, related_name="primary_periods_datasheets") + college_periods = models.ManyToManyField(TimePeriod, limit_choices_to={'school_period':TimePeriod.TIME_PERIOD_DICT[u'Collège']}, related_name="college_periods_datasheets") + highschool_periods = models.ManyToManyField(TimePeriod, limit_choices_to={'school_period':TimePeriod.TIME_PERIOD_DICT[u'Lycée']}, related_name="highschool_periods_datasheets") + primary_themes = models.ManyToManyField(Domain, limit_choices_to={'school_period':Domain.DOMAIN_PERIOD_DICT[u'Primaire']}, related_name="primary_themes_datasheets") + college_themes = models.ManyToManyField(Domain, limit_choices_to={'school_period':Domain.DOMAIN_PERIOD_DICT[u'Collège']}, related_name="college_themes_datasheets") + highschool_themes = models.ManyToManyField(Domain, limit_choices_to={'school_period':Domain.DOMAIN_PERIOD_DICT[u'Lycée']}, related_name="highschool_themes_datasheets") + town = models.ForeignKey(Location, null=True, blank=True) + format = models.ForeignKey(DocumentFormat, null=True, blank=True) + original_creation_date = models.DateField() + original_modification_date = models.DateField() + modification_datetime = models.DateTimeField(auto_now=True) + validation_date = models.DateTimeField() + validated = models.BooleanField() + validator = models.ForeignKey(User) + tags = models.ManyToManyField(Tag, through='TaggedSheet') + + + def validate(self, user): + self.validation_date = datetime.datetime.now() + self.validated = True + self.validator = user + self.save() + + def unvalidate(self): + self.validation_date = datetime.datetime.min + self.validated = False + self.validator = None + self.save() + + @Property + def domain_text(): #@NoSelf + def fget(self): + return "; ".join([d.label for d in self.domains.all()]) + + return locals() + + @Property + def primary_periods_text(): #@NoSelf + def fget(self): + return "; ".join([d.label for d in self.primary_periods.all()]) + + return locals() + + + @Property + def college_periods_text(): #@NoSelf + def fget(self): + return "; ".join([d.label for d in self.college_periods.all()]) + + return locals() + + @Property + def highschool_periods_text(): #@NoSelf + def fget(self): + return "; ".join([d.label for d in self.highschool_periods.all()]) + + return locals() + + + @Property + def primary_themes_text(): #@NoSelf + def fget(self): + return "; ".join([d.label for d in self.primary_themes.all()]) + + return locals() + + @Property + def college_themes_text(): #@NoSelf + def fget(self): + return "; ".join([d.label for d in self.college_themes.all()]) + + return locals() + + @Property + def highschool_themes_text(): #@NoSelf + def fget(self): + return "; ".join([d.label for d in 
self.highschool_themes.all()]) + + return locals() + + @Property + def town_text(): #@NoSelf + def fget(self): + return self.town.name if self.town else "" + + return locals() + + +class TaggedSheet(models.Model): + datasheet = models.ForeignKey(Datasheet) + tag = models.ForeignKey(Tag) + original_order = models.IntegerField() + order = models.IntegerField() + index_note = models.FloatField() + categories = models.ManyToManyField(TagCategory) + \ No newline at end of file diff -r 000000000000 -r 896db0083b76 web/hdabo/modwsgi.wsgi --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/web/hdabo/modwsgi.wsgi Thu May 26 17:15:39 2011 +0200 @@ -0,0 +1,30 @@ +import os, sys, site + +def application(environ, start_response): + + global g_env_set + + if 'g_env_set' not in globals() or not g_env_set: + os.environ['DJANGO_SETTINGS_MODULE'] = environ['DJANGO_SETTINGS_MODULE'] + + prev_sys_path = list(sys.path) + + sys.path.append(environ['PROJECT_PATH']) + for path in environ.get('PYTHON_PATH',"").split(os.pathsep): + if path: + site.addsitedir(path) + + new_sys_path = [] + for item in list(sys.path): + if item not in prev_sys_path and item not in new_sys_path: + new_sys_path.append(item) + sys.path.remove(item) + sys.path[:0] = new_sys_path + g_env_set = True + + import django.core.handlers.wsgi + + _application = django.core.handlers.wsgi.WSGIHandler() + + return _application(environ, start_response) + diff -r 000000000000 -r 896db0083b76 web/hdabo/settings.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/web/hdabo/settings.py Thu May 26 17:15:39 2011 +0200 @@ -0,0 +1,146 @@ +# Django settings for hdabo project. + +DEBUG = True +TEMPLATE_DEBUG = DEBUG + +ADMINS = ( + # ('Your Name', 'your_email@example.com'), +) + +MANAGERS = ADMINS + +DATABASES = { + 'default': { + 'ENGINE': 'django.db.backends.', # Add 'postgresql_psycopg2', 'postgresql', 'mysql', 'sqlite3' or 'oracle'. + 'NAME': '', # Or path to database file if using sqlite3. + 'USER': '', # Not used with sqlite3. + 'PASSWORD': '', # Not used with sqlite3. + 'HOST': '', # Set to empty string for localhost. Not used with sqlite3. + 'PORT': '', # Set to empty string for default. Not used with sqlite3. + } +} + +# Local time zone for this installation. Choices can be found here: +# http://en.wikipedia.org/wiki/List_of_tz_zones_by_name +# although not all choices may be available on all operating systems. +# On Unix systems, a value of None will cause Django to use the same +# timezone as the operating system. +# If running in a Windows environment this must be set to the same as your +# system time zone. +TIME_ZONE = 'America/Chicago' + +# Language code for this installation. All choices can be found here: +# http://www.i18nguy.com/unicode/language-identifiers.html +LANGUAGE_CODE = 'en-us' + +SITE_ID = 1 + +# If you set this to False, Django will make some optimizations so as not +# to load the internationalization machinery. +USE_I18N = True + +# If you set this to False, Django will not format dates, numbers and +# calendars according to the current locale +USE_L10N = True + +# Absolute filesystem path to the directory that will hold user-uploaded files. +# Example: "/home/media/media.lawrence.com/media/" +MEDIA_ROOT = '' + +# URL that handles the media served from MEDIA_ROOT. Make sure to use a +# trailing slash. +# Examples: "http://media.lawrence.com/media/", "http://example.com/media/" +MEDIA_URL = '' + +# Absolute path to the directory static files should be collected to. 
+# Don't put anything in this directory yourself; store your static files +# in apps' "static/" subdirectories and in STATICFILES_DIRS. +# Example: "/home/media/media.lawrence.com/static/" +STATIC_ROOT = '' + +# URL prefix for static files. +# Example: "http://media.lawrence.com/static/" +STATIC_URL = '/static/' + +# URL prefix for admin static files -- CSS, JavaScript and images. +# Make sure to use a trailing slash. +# Examples: "http://foo.com/static/admin/", "/static/admin/". +ADMIN_MEDIA_PREFIX = '/static/admin/' + +# Additional locations of static files +STATICFILES_DIRS = ( + # Put strings here, like "/home/html/static" or "C:/www/django/static". + # Always use forward slashes, even on Windows. + # Don't forget to use absolute paths, not relative paths. +) + +# List of finder classes that know how to find static files in +# various locations. +STATICFILES_FINDERS = ( + 'django.contrib.staticfiles.finders.FileSystemFinder', + 'django.contrib.staticfiles.finders.AppDirectoriesFinder', +# 'django.contrib.staticfiles.finders.DefaultStorageFinder', +) + +# Make this unique, and don't share it with anybody. +SECRET_KEY = '1anp2v#36%z(pahi5ytghik&-eg8t96)&$t)b%i=7@%=)u$pyn' + +# List of callables that know how to import templates from various sources. +TEMPLATE_LOADERS = ( + 'django.template.loaders.filesystem.Loader', + 'django.template.loaders.app_directories.Loader', +# 'django.template.loaders.eggs.Loader', +) + +MIDDLEWARE_CLASSES = ( + 'django.middleware.common.CommonMiddleware', + 'django.contrib.sessions.middleware.SessionMiddleware', + 'django.middleware.csrf.CsrfViewMiddleware', + 'django.contrib.auth.middleware.AuthenticationMiddleware', + 'django.contrib.messages.middleware.MessageMiddleware', +) + +ROOT_URLCONF = 'hdabo.urls' + +TEMPLATE_DIRS = ( + # Put strings here, like "/home/html/django_templates" or "C:/www/django/templates". + # Always use forward slashes, even on Windows. + # Don't forget to use absolute paths, not relative paths. +) + +INSTALLED_APPS = ( + 'django.contrib.auth', + 'django.contrib.contenttypes', + 'django.contrib.sessions', + 'django.contrib.sites', + 'django.contrib.messages', + 'django.contrib.staticfiles', + 'django.contrib.admin', + 'hdabo', +) + +# A sample logging configuration. The only tangible logging +# performed by this configuration is to send an email to +# the site admins on every HTTP 500 error. +# See http://docs.djangoproject.com/en/dev/topics/logging for +# more details on how to customize your logging configuration. 
+LOGGING = { + 'version': 1, + 'disable_existing_loggers': False, + 'handlers': { + 'mail_admins': { + 'level': 'ERROR', + 'class': 'django.utils.log.AdminEmailHandler' + } + }, + 'loggers': { + 'django.request': { + 'handlers': ['mail_admins'], + 'level': 'ERROR', + 'propagate': True, + }, + } +} + + +from hdabo.config import * #@UnusedWildImport diff -r 000000000000 -r 896db0083b76 web/hdabo/static/.keepme diff -r 000000000000 -r 896db0083b76 web/hdabo/static/hdabo/.keepme diff -r 000000000000 -r 896db0083b76 web/hdabo/static/hdabo/css/.keepme diff -r 000000000000 -r 896db0083b76 web/hdabo/static/hdabo/img/.keepme diff -r 000000000000 -r 896db0083b76 web/hdabo/static/hdabo/js/.keepme diff -r 000000000000 -r 896db0083b76 web/hdabo/tests/__init__.py diff -r 000000000000 -r 896db0083b76 web/hdabo/tests/models.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/web/hdabo/tests/models.py Thu May 26 17:15:39 2011 +0200 @@ -0,0 +1,12 @@ +from django.test import TestCase + +from hdabo.models import Datasheet + +class DatasheetTest(TestCase): + fixtures = ['datasheet_text'] + + def setUp(self): + self.datasheet_instance = Datasheet.objects.get(hda_id = 1) + + def test_domain_text(self): + self.assertEqual("Arts du quotidien; Arts du visuel",self.datasheet_instance.domain_text) diff -r 000000000000 -r 896db0083b76 web/hdabo/urls.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/web/hdabo/urls.py Thu May 26 17:15:39 2011 +0200 @@ -0,0 +1,17 @@ +from django.conf.urls.defaults import patterns, include, url +from django.contrib import admin + +# Uncomment the next two lines to enable the admin: +admin.autodiscover() + +urlpatterns = patterns('', + # Examples: + # url(r'^$', 'hdabo.views.home', name='home'), + # url(r'^hdabo/', include('hdabo.foo.urls')), + + # Uncomment the admin/doc line below to enable admin documentation: + # url(r'^admin/doc/', include('django.contrib.admindocs.urls')), + + # Uncomment the next line to enable the admin: + url(r'^admin/', include(admin.site.urls)), +) diff -r 000000000000 -r 896db0083b76 web/hdabo/utils.py --- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/web/hdabo/utils.py Thu May 26 17:15:39 2011 +0200 @@ -0,0 +1,6 @@ + +### +# allow to declare a property as a decorator +### +def Property(func): + return property(**func()) \ No newline at end of file diff -r 000000000000 -r 896db0083b76 web/static/.keepme diff -r 000000000000 -r 896db0083b76 web/static/media/.keepme diff -r 000000000000 -r 896db0083b76 web/static/site/.keepme
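Illustrative sketch, not part of the changeset: the Property helper in web/hdabo/utils.py
turns a function that returns its accessors via locals() into a property, the pattern used
by the *_text properties in web/hdabo/models.py. The Example class and its field names are
assumptions made up for this sketch.

    from hdabo.utils import Property

    class Example(object):
        def __init__(self, label):
            self._label = label

        @Property
        def label():  # no `self`: this outer function only collects the accessors
            def fget(self):
                return self._label
            def fset(self, value):
                self._label = value
            return locals()  # expanded by Property() into property(fget=fget, fset=fset)

    # Example("painting").label returns "painting"; assigning to .label goes through fset.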