* Move importcollection and importmetacategories commands to the generic project
* Add site synchronisation script
* Add some test data
* remove dev settings
* Add setup file to build the application
* update .hgignore
--- a/.hgignore Tue Jun 26 15:55:08 2018 +0200
+++ b/.hgignore Wed Jun 27 16:00:29 2018 +0200
@@ -3,16 +3,13 @@
^virtualenv/
^__pychache__/
-^src/iconolab_mcc/settings/dev\.py$
+^src/iconolab_episteme/settings/dev\.py$
node_modules/
\.css.map$
\.js.map$
-^src/iconolab/static/iconolab/js/iconolab-bundle/dist/
\.orig$
-^src_js/iconolab-bundle/dist/
-
\.log$
^web/*
^\.pydevproject$
@@ -38,6 +35,11 @@
^src/.direnv
^src/.envrc$
-^data/
+^run/
-^.vscode
\ No newline at end of file
+^.vscode
+^sbin/sync/.direnv
+^sbin/sync/.envrc
+^sbin/sync/.vscode
+^sbin/sync/.vagrant
+^sbin/sync/fabric.py$
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/README.md Wed Jun 27 16:00:29 2018 +0200
@@ -0,0 +1,141 @@
+## ICONOLAB-EPISTEME
+
+### 1. Configuration and setup for development
+
+### Virtualenv
+
+- Install pip
+- Create a virtualenv for the project (using virtualenvwrapper is recommended if possible). The Python version is 3.6.5
+
+Example
+```
+virtualenv -p python3.6 ./virtualenv
+. virtualenv/bin/activate
+
+cd src/
+pip install -r requirements/dev.txt
+```
+
+
+### Django project setup
+
+- Install iconolab
+ pip install -e path/iconolab
+
+- Copy iconolab/src/settings/dev.py.tmpl to iconolab-episteme/src/iconolab_mcc/settings/dev.py and adapt its content to your configuration
+- cd into iconolab-episteme/src folder and run
+
+ python manage.py migrate
+
+to create database tables
+
+- Run
+
+ python manage.py createsuperuser
+
+to create an admin user
+
+
+- Collect static files
+
+ python manage.py collectstatic
+
+
+- Use the Docker address for HOST in the settings
+
+- Do not use os.path.join(BASE_DIR, 'media') in the settings; use an absolute path instead
+
+- Set JS_DEV_MODE to False (otherwise it will not connect correctly on the server)
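The three notes above can be sketched as a dev.py fragment. Apart from JS_DEV_MODE, every setting name and value here is an assumption to be adapted to your dev.py.tmpl:

```python
# Hypothetical dev.py fragment illustrating the notes above; the
# concrete values are placeholders, not the project's real settings.
DATABASE_HOST = "172.17.0.1"        # assumed Docker bridge address for the HOST setting
MEDIA_ROOT = "/srv/iconolab/media"  # an absolute path, not os.path.join(BASE_DIR, 'media')
JS_DEV_MODE = False                 # must be False, or it will not connect correctly on the server
```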
+
+
+### Docker
+
+Move the docker-compose.yml file into the iconolab-episteme project
+
+- Create containers
+
+ docker-compose up
+
+- Run containers
+
+ docker-compose start
+
+
+### Elasticsearch
+
+Some objects in Iconolab are indexed and searched using Elasticsearch. You need to configure the Elasticsearch parameters (see dev.py.tmpl, ELASTICSEARCH_DSL) and run:
+
+ python manage.py search_index --rebuild
+
+
+### 2. Python server
+
+- cd into the iconolab-episteme/src folder and run
+
+ python manage.py runserver
+
+By default, the app is accessible through http://127.0.0.1:8000/home
+
+
+### 3. Importing initial data from CSV
+
+Make sure to have the following in the same folder:
+
+* All the images to import. The image names must match their respective item inventory number.
+* A csv file that contains the metadata for the items you will import
+* A json file for initializing the collection in the database. (Optional if you want to import images in an existing collection)
+* A json file for the metacategories that will be linked to the collection.
+* Ensure the folder settings.MEDIA_ROOT+/uploads/ exists
+
+The following django manage.py command is used to import collection data and images:
+
+```
+python manage.py importimages <:export-csv-path> --delimiter <:delimiter> --encoding <:encoding> --collection-json <:collection_fixture_FILENAME> (OR --collection-id <:collection_id> if collection already exists in db) --metacategories-json <:metacategories_json_FILENAME>
+```
+
+Options:
+- ```--delimiter```: the delimiter for the csv file. For special ASCII characters, add a # before the decimal code. Supported special characters are 9 (tab), 29 (group separator), 30 (record separator) and 31 (unit separator)
+- ```--encoding```: the encoding to use if the csv file is not in utf-8. Example: 8859 for ISO-8859
+- ```--collection-json```: the json file to create the collection from
+- ```--collection-id```: the id of the collection to import into; it must already exist
+- ```--metacategories-json```: the json file to create metacategories on the collection we're importing into
+- ```--jpeg-quality```: the JPEG quality; defaults to the setting IMG_JPG_DEFAULT_QUALITY
+- ```--no-jpg-conversion```: set to True so the command does not convert the images to JPEG. Useful for pre-converted JPEGs, especially when importing large image banks
+- ```--img-filename-identifier```: the column from which the command will try to find images in the folder: use keys from the setting IMPORT_FIELDS_DICT. Default is "INV".
+- ```--filename-regexp-prefix```: customizes the way the command tries to find images by specifying a regexp pattern to match *before* the identifier provided in img-filename-identifier. Defaults to .*
+- ```--filename-regexp-suffix```: customizes the way the command tries to find images by specifying a regexp pattern to match *after* the identifier provided in img-filename-identifier. Defaults to [\.\-_].*
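The `--delimiter` convention can be illustrated with a small sketch (the helper name is hypothetical, not the command's actual implementation):

```python
# Hypothetical helper showing how a "#<code>" delimiter value maps to
# the corresponding ASCII character (e.g. "#9" selects a tab).
SUPPORTED_CODES = {9, 29, 30, 31}  # tab, group/record/unit separators

def decode_delimiter(value):
    if value.startswith("#"):
        code = int(value[1:])
        if code not in SUPPORTED_CODES:
            raise ValueError("unsupported special character code: %d" % code)
        return chr(code)
    return value
```

For instance, `--delimiter "#9"` would select a tab-separated file, while `--delimiter ";"` is passed through unchanged.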
+
+Notes:
+* The export csv path will be used to find everything else (images and fixture files).
+* If the csv file is not encoded in utf-8, you MUST provide --encoding so the csv file can be read
+* You MUST provide either --collection-json or --collection-id, otherwise the command does not know which collection the objects belong to.
+* To find all images for a given item, the command will try to match filenames against the pattern built from the three options: filename-regexp-prefix+<value of img-filename-identifier>+filename-regexp-suffix. For instance, by default, for an object with an INV of MIC.3.10, the files MIC.3.10.jpg and MIC.3.10.verso.jpg would be matched and linked to the object.
+* The command will first parse the csv, then create the objects in the database (Item and ItemMetadata), then move the images to the settings.MEDIA_ROOT+/uploads/ folder after converting them to JPEG, then create the database objects for the images. The command will ignore any csv row that lacks an image or any csv row that already has a database entry for the collection (by default INV number is used to test if a database entry exists).
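The filename matching described above can be sketched as follows (a simplified illustration with the documented defaults, not the command's actual code; the identifier is escaped here so that the dots in an INV number are matched literally):

```python
import re

# Build the pattern prefix + identifier + suffix, with the documented
# defaults: prefix ".*" and suffix "[\.\-_].*".
def build_filename_pattern(identifier, prefix=r".*", suffix=r"[\.\-_].*"):
    return re.compile(prefix + re.escape(identifier) + suffix)

# Example from the notes above: an object with an INV of MIC.3.10.
pattern = build_filename_pattern("MIC.3.10")
files = ["MIC.3.10.jpg", "MIC.3.10.verso.jpg", "MIC.3.11.jpg"]
matched = [f for f in files if pattern.fullmatch(f)]
```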
+
+
+### 4. Updating already existing data
+
+Another management command allows editing data using only a .csv file. The command goes through the csv and updates the metadata of every object it finds in the database with the content of the matching csv row.
+
+```
+python manage.py updatecollection --collection-id=<:id> --delimiter=<:delimiter> --encoding=<:encoding>
+```
+
+Options:
+- ```--delimiter```: the delimiter for the csv file. For special ASCII characters, add a # before the decimal code. Supported special characters are 9 (tab), 29 (group separator), 30 (record separator) and 31 (unit separator)
+- ```--encoding```: the encoding to use if the csv file is not in utf-8. Example: 8859 for ISO-8859
+- ```--collection-id```: the id of the collection to update; it must already exist
+
+
+ * Example test import command: `python manage.py importimages --collection-json dossierImportMontauban/montauban_collection.json --metacategories-json dossierImportMontauban/montauban_metacategories_import.json --encoding "UTF-8" --delimiter "," dossierImportMontauban/ExportMontauban.csv`
+
+
+### 5. Defining a new version
+
+To define a new version, the following steps must be taken:
+* If needed, update the `iconolab` version number in `src/setup.py`, in the `install_requires` key.
+* Check that the matching tag has been published in the `iconolab` mercurial repository.
+* Update the version number in `src/iconolab_mcc/__init__.py`
+* Create a mercurial tag that matches the new version number
+* Push the new commits and the tag
+
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/data/eso_collection.json Wed Jun 27 16:00:29 2018 +0200
@@ -0,0 +1,6 @@
+{
+ "name": "eso",
+ "verbose_name": "European Southern Observatory",
+ "description": "L’ESO construit et gère les télescopes astronomiques au sol les plus puissants au monde qui permettent d’importantes découvertes scientifiques.",
+ "image": "foo.jpg"
+}
\ No newline at end of file
Binary file data/eso_logo.jpg has changed
Binary file data/eso_logo.png has changed
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/data/eso_metacategories.json Wed Jun 27 16:00:29 2018 +0200
@@ -0,0 +1,15 @@
+[
+ {
+ "label": "Appel à contribution",
+ "triggers_notifications": 1
+ },{
+ "label": "Appel à expertise",
+ "triggers_notifications": 3
+ },{
+ "label": "Référence"
+ },{
+ "label": "Accord"
+ },{
+ "label": "Désaccord"
+ }
+]
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/sbin/sync/README.md Wed Jun 27 16:00:29 2018 +0200
@@ -0,0 +1,42 @@
+# Deployment scripts for ICONOLAB-MCC
+
+These scripts use `fabric` 2.x.
+
+ * create a virtualenv
+ * `$ pip install -r requirements.txt`
+
+## Configuration
+
+The scripts are configured in the `fabric.py` file
+ * ` cp fabric.py.tmpl fabric.py`
+ * edit the file to match the expected configuration
+
+## Tasks
+The following tasks are defined:
+
+ - create-virtualenv
+ - publish-version
+ - relaunch-server
+
+### relaunch-server
+
+Usage: `$ fab relaunch-server`
+Description: relaunches the web services after running the following commands on the server:
+ * `django-admin migrate`
+ * `django-admin collectstatic`
+
+### create-virtualenv
+
+Usage: `$ fab create-virtualenv <hg_tags>`
+Description: Create a virtualenv on the server with the new application version and the matching requirements. This deletes the previous virtualenv first.
+Please note that this command only creates the virtualenv. It does not create the full setup for the application:
+ * the settings file,
+ * the applications orchestration scripts,
+ * the application folders
+
+### publish-version
+
+Usage: `$ fab publish-version <hg_tags>`
+Description: publishes the given version of the website. It calls the following commands:
+ * `create-virtualenv <hg_tags>`
+ * `relaunch-server`
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/sbin/sync/Vagrantfile Wed Jun 27 16:00:29 2018 +0200
@@ -0,0 +1,85 @@
+# -*- mode: ruby -*-
+# vi: set ft=ruby :
+
+# All Vagrant configuration is done below. The "2" in Vagrant.configure
+# configures the configuration version (we support older styles for
+# backwards compatibility). Please don't change it unless you know what
+# you're doing.
+Vagrant.configure("2") do |config|
+ # The most common configuration options are documented and commented below.
+ # For a complete reference, please see the online documentation at
+ # https://docs.vagrantup.com.
+
+ # Every Vagrant development environment requires a box. You can search for
+ # boxes at https://vagrantcloud.com/search.
+ config.vm.box = "alpine/alpine64"
+
+ # Disable automatic box update checking. If you disable this, then
+ # boxes will only be checked for updates when the user runs
+ # `vagrant box outdated`. This is not recommended.
+ # config.vm.box_check_update = false
+
+ # Create a forwarded port mapping which allows access to a specific port
+ # within the machine from a port on the host machine. In the example below,
+ # accessing "localhost:8080" will access port 80 on the guest machine.
+ # NOTE: This will enable public access to the opened port
+ # config.vm.network "forwarded_port", guest: 80, host: 8080
+
+ # Create a forwarded port mapping which allows access to a specific port
+ # within the machine from a port on the host machine and only allow access
+ # via 127.0.0.1 to disable public access
+ # config.vm.network "forwarded_port", guest: 80, host: 8080, host_ip: "127.0.0.1"
+
+ # Create a private network, which allows host-only access to the machine
+ # using a specific IP.
+ # config.vm.network "private_network", ip: "172.16.1.10", auto_config: false
+
+ # Create a public network, which generally matched to bridged network.
+ # Bridged networks make the machine appear as another physical device on
+ # your network.
+ # config.vm.network "public_network"
+
+ # Share an additional folder to the guest VM. The first argument is
+ # the path on the host to the actual folder. The second argument is
+ # the path on the guest to mount the folder. And the optional third
+ # argument is a set of non-required options.
+ # config.vm.synced_folder "../data", "/vagrant_data"
+
+ # Provider-specific configuration so you can fine-tune various
+ # backing providers for Vagrant. These expose provider-specific options.
+ # Example for VirtualBox:
+ #
+ # config.vm.provider "virtualbox" do |vb|
+ # # Display the VirtualBox GUI when booting the machine
+ # vb.gui = true
+ #
+ # # Customize the amount of memory on the VM:
+ # vb.memory = "1024"
+ # end
+ #
+ # View the documentation for the provider you are using for more
+ # information on available options.
+
+ # Enable provisioning with a shell script. Additional provisioners such as
+ # Puppet, Chef, Ansible, Salt, and Docker are also available. Please see the
+ # documentation for more information about their specific syntax and use.
+ config.vm.provision "shell", inline: <<-SHELL
+ apk update
+ apk upgrade
+ apk add python3
+ apk add git
+ apk add build-base
+ apk add postgresql-client
+ apk add musl-dev postgresql-dev
+ apk add jpeg-dev zlib-dev
+ apk add python3-dev
+ apk add memcached-dev
+ apk add libmemcached
+ apk add libmemcached-dev
+ apk add linux-dev
+ apk add linux-headers
+ SHELL
+
+ config.vbguest.auto_update = false
+
+end
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/sbin/sync/core.py Wed Jun 27 16:00:29 2018 +0200
@@ -0,0 +1,281 @@
+# -*- coding: utf-8 -*-
+'''
+Created on Feb 20, 2013
+
+@author: ymh
+'''
+# from fabric.api import run, local, env, cd, put, prefix, sudo, lcd
+# from fabric.colors import green
+# from fabric.context_managers import settings
+# from fabric.contrib.files import exists, upload_template
+# from fabric.contrib.project import rsync_project
+# from fabric.tasks import Task
+from fabric import Connection
+import imp
+import os.path
+import re
+import shutil
+import sys
+import urllib.parse
+import requirements
+
+
+# __all__ = ["check_folder_access", "migrate", "collectstatic", "do_relaunch_server",
+# "export_version", "do_sync_web", "create_config", "clean_export_folder",
+# "sync_install_build", "do_create_virtualenv", "clean_rsync_folder", "rsync_export",
+# "do_sync_comp", "get_comp_versions_dict", "SyncComp", "get_src_version", "sync_build",
+# "install_build", "do_create_virtualenv_requirement", "build_src"]
+
+def get_export_path(env, version):
+ base_path = os.path.join(env.base_export_path,env.export_prefix).rstrip("/")
+ return os.path.expanduser(base_path) + "_%s" % (str(version))
+
+def clean_export_folder(path):
+ print("Removing %s" % path)
+ if os.path.isdir(path):
+ shutil.rmtree(path, ignore_errors=True)
+
+def do_export_version(c, path, **export_keys):
+ print("Export version %s : %s" % (path,repr(export_keys)))
+
+ env = c.env
+
+ for export_key, version in export_keys.items():
+ export_path = os.path.join(path,export_key)
+
+ repo_url = env.repos[export_key]['repo']
+ url_part = urllib.parse.urlparse(repo_url)
+ if url_part.scheme or url_part.netloc:
+ # this is a remote repo. Let's clone first
+ clone_path = os.path.join(path,'clone',export_keys[export_key])
+ os.makedirs(clone_path)
+
+ output = c.run('git ls-remote \"%s\"' % repo_url, warn=True)
+ print("OUTPUT %r" % output)
+ scm = "hg" if output.failed else "git"
+ if scm == "hg":
+ output = c.run("hg clone \"%s\" \"%s\"" % (repo_url,clone_path))
+ else:
+ c.run("git clone \"%s\" \"%s\"" % (repo_url,clone_path))
+ else:
+ clone_path = repo_url
+
+ with c.cd(clone_path):
+            # detect .git or .hg subfolder
+ if os.path.exists(os.path.join(clone_path,".git")):
+ os.makedirs(export_path)
+ cmd_str = "git archive \'%s\' | tar -x -C \"%s\""
+ else:
+ cmd_str = "hg archive -r \'%s\' \"%s\""
+ c.run(cmd_str % (str(version),export_path))
+
+ print("Export version %s done"%repr(export_keys))
+
+def launch_setup_command(c, command_array, path):
+ f = None
+ sys.path.append(path)
+ current_path = os.getcwd()
+ try:
+ os.chdir(path)
+ try:
+ f, pathname, description = imp.find_module("setup", [path])
+ print("launch_setup_command at %s : found setup" % path)
+ setup_mod = imp.load_module("setup", f, pathname, description)
+ print("launch_setup_command at %s : setup loaded" % path)
+ except:
+ e = sys.exc_info()[0]
+ print("Error launching commands %s : %s" % (path, str(e)))
+ raise
+ finally:
+ if f:
+ f.close()
+
+ return setup_mod.launch_setup("setup.py", command_array)
+ finally:
+ os.chdir(current_path)
+
+
+def get_src_dependencies(c, pkg_name, path):
+ print("Get source dependencies at %s" % path)
+ launch_setup_command(c, ['egg_info'], path)
+ egg_requirement_file = os.path.join(path, "%s.egg-info" % pkg_name, "requires.txt")
+ res = []
+ with open(egg_requirement_file) as f:
+ for req in requirements.parse(f):
+ if req.name in c.env['repos']:
+ r_version = req.specs[0][1] if req.specs else 'tip'
+ res.append((req.name, r_version))
+ print("Build source dist at %s done : %r" % (path, res))
+ return res
+
+
+def get_remote_env(c, remotepath, remotevirtualenvpath, application_module, settings_key, settings_module=''):
+ if not settings_module:
+ settings_module = '%s.%s' % (application_module, 'settings')
+ activate_path = os.path.join(remotevirtualenvpath, "bin/activate")
+
+ env = c.env
+ with Connection(env['hosts'][0]) as rconnection:
+ with rconnection.prefix("echo $SHELL && . \"%s\"" % os.path.join(remotevirtualenvpath, "bin/activate")), rconnection.prefix("export PYTHONPATH=\"%s\"" % remotepath):
+ return rconnection.run("DJANGO_SETTINGS_MODULE=%s python -c 'import django.conf;print(django.conf.settings.%s)'" % (settings_module, settings_key)).stdout
+
+
+# def rsync_export(path, remotepath, filters):
+# print("Rsync %s to %s",(path,remotepath))
+
+# filter_option_str = "--progress --stats"
+# if filters:
+# filter_option_str += " " + " ".join(["--filter \"%s\"" % (f) for f in filters])
+
+# run("mkdir -p \"%s\"" % remotepath)
+# rsync_project(remotepath, local_dir=path, extra_opts=filter_option_str, delete=True)
+# print("Rsync %s to %s done",(path,remotepath))
+
+# def clean_rsync_folder(remotepath):
+# print("clean rsync folder %s" % remotepath)
+# run("rm -fr \"%s\"" % remotepath)
+
+def build_src(c, path):
+ print("Build source dist at %s" % path)
+ launch_setup_command(c, ['sdist'], path)
+ print("Build source dist at %s done" % path)
+
+
+def get_src_version(c, key, path):
+
+ print("get src version for %s at %s" % (key,path))
+
+ env = c.env
+
+ mod_name = env.repos[key].get('module', key) or key
+
+ f = None
+ sys.path.append(path)
+ current_path = os.getcwd()
+ os.chdir(path)
+ try:
+ f, pathname, description = imp.find_module(mod_name, [path])
+ src_mod = imp.load_module(mod_name, f, pathname, description)
+ except:
+ src_mod = None
+ print("Could not import module, trying to parse")
+ finally:
+ os.chdir(current_path)
+ if f:
+ f.close()
+ version = None
+ if src_mod is None:
+ with open(os.path.join(path,mod_name,"__init__.py"),'r') as init_file:
+ for line in init_file:
+ m = re.search('VERSION\s+=\s+\((.+)\)', line, re.I)
+ if m:
+ version = tuple([re.sub('[\s\"\']','', item) for item in m.group(1).split(',')])
+ break
+ elif hasattr(src_mod, "VERSION"):
+ version = src_mod.VERSION
+ elif hasattr(src_mod, "__version__"):
+ version = src_mod.__version__
+
+ print("VERSION : %s" % repr(version))
+
+ if version is None:
+ version = ""
+
+ if not isinstance(version, str):
+ if src_mod and hasattr(src_mod, "get_version"):
+ version_str = src_mod.get_version()
+ elif isinstance(version, tuple):
+ #convert num
+ version_str = get_version([int(s) if s.isdigit() else s for s in version])
+ else:
+ version_str = str(version)
+ else:
+ version_str = version
+
+ print("VERSION str : %s" % repr(version_str))
+ return (version, version_str)
+
+
+def sync_build(c, path):
+ print("Sync build %s" % path)
+ env = c.env
+ with Connection(env['hosts'][0]) as host_connection:
+ with host_connection.cd(env.remote_path['build_export']):
+ filename = os.path.basename(path)
+ res_trans = host_connection.put(path, os.path.join(env.remote_path['build_export'], filename))
+ print("Sync build %s to %s" % (path,res_trans.remote))
+ return res_trans
+
+
+def collectstatic(c, remotepath, remotevirtualenvpath, platform_web_module, module_settings="", admin_cmd="python manage.py"):
+ print("Collect static in %s with %s" % (remotepath, remotevirtualenvpath))
+ remotestaticsitepath = get_remote_env(c, remotepath, remotevirtualenvpath, platform_web_module, "STATIC_ROOT", c.env.settings)
+ activate_path = os.path.join(remotevirtualenvpath, "bin/activate")
+ with Connection(c.env['hosts'][0]) as rconnection:
+ with rconnection.prefix("source \"%s\"" % activate_path), rconnection.prefix("export PYTHONPATH=\"%s\"" % remotepath), rconnection.cd(remotepath):
+            # remove old files first; the -c option of collectstatic fails otherwise
+ rconnection.run("rm -fr \"%s\"/*" % (remotestaticsitepath))
+ rconnection.run("%s collectstatic --noinput %s" % (admin_cmd, "--settings="+module_settings if module_settings else ""))
+
+
+def migrate(c, remotepath, remotevirtualenvpath, module_settings="", admin_cmd="python manage.py"):
+ activate_path = os.path.join(remotevirtualenvpath, "bin/activate")
+ with Connection(c.env['hosts'][0]) as rconnection:
+ with rconnection.prefix("source \"%s\"" % activate_path), rconnection.prefix("export PYTHONPATH=\"%s\"" % remotepath), rconnection.cd(remotepath):
+ rconnection.run("%s migrate --noinput %s" % (admin_cmd, "--settings="+module_settings if module_settings else ""))
+
+
+def export_version(c, **kwargs):
+ print("export version %s" % (repr(kwargs)))
+
+ export_path = kwargs.get('path', None)
+
+ if not export_path:
+ export_path = get_export_path(c.env, "_".join(["%s_%s" % (k,v) for k,v in kwargs.items()]))
+
+ clean_export_folder(export_path)
+
+ do_export_version(c, export_path,**kwargs)
+
+ return export_path
+
+def do_create_virtualenv(c, remote_venv_export_path, remotevirtualenvpath):
+ print("Create virtualenv export_path : %s - remote venvpath : %s" % (remote_venv_export_path, remotevirtualenvpath))
+ env = c.env
+ activate_path = os.path.join(remotevirtualenvpath, "bin/activate")
+ if env.get('remote_baseline_venv'):
+ prefix_str = "source \"%s\"" % os.path.join(env.get('remote_baseline_venv'), "bin/activate")
+ else:
+ prefix_str = "echo"
+ with Connection(env['hosts'][0]) as rconnection:
+ rconnection.run("rm -fr \"%s\"" % remotevirtualenvpath, warn=True)
+        rconnection.run("mkdir -p \"%s\"" % remotevirtualenvpath)
+ with rconnection.prefix(prefix_str), rconnection.cd(os.path.join(remote_venv_export_path,"virtualenv","web")):
+ rconnection.run("python create_python_env.py")
+ rconnection.run("python project-boot.py \"%s\"" % remotevirtualenvpath)
+ with rconnection.prefix("source \"%s\"" % activate_path):
+ rconnection.run("pip install --no-cache-dir -r \"%s\"" % os.path.join(remote_venv_export_path,"virtualenv","web","res","srvr_requirements.txt"))
+
+def do_create_virtualenv_requirement(c, remote_venv_requirement_path, remotevirtualenvpath, python_version = "2"):
+ print("Create virtualenv export_path : %s - remote venvpath : %s" % (remote_venv_requirement_path, remotevirtualenvpath))
+ env = c.env
+ with Connection(env['hosts'][0]) as rconnection:
+ rconnection.run("rm -fr \"%s\"" % remotevirtualenvpath, warn=True)
+ rconnection.run("mkdir -p \"%s\"" % remotevirtualenvpath)
+ # rconnection.run("virtualenv -p `which python%s` %s" % (python_version, remotevirtualenvpath))
+ rconnection.run("python%s -m venv %s" % (python_version, remotevirtualenvpath))
+ with rconnection.prefix("echo $SHELL && . \"%s\"" % os.path.join(remotevirtualenvpath, "bin/activate")):
+ rconnection.run("pip install -r \"%s\"" % remote_venv_requirement_path)
+
+
+def do_relaunch_server(c, do_collectstatic, do_migrate):
+ env = c.env
+
+ if do_migrate:
+ migrate(c, env.remote_path['src'], env.remote_path['virtualenv'], env.get('settings', ''), env.get('admin_cmd', 'python manage.py'))
+ if do_collectstatic:
+ collectstatic(c, env.remote_path['src'], env.remote_path['virtualenv'], env.platform_web_module, env.get('settings', ''), env.get('admin_cmd', 'python manage.py'))
+
+ with Connection(env['hosts'][0]) as rconnection:
+ rconnection.sudo(env.web_relaunch_cmd, shell=False)
+
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/sbin/sync/fabfile.py Wed Jun 27 16:00:29 2018 +0200
@@ -0,0 +1,90 @@
+import imp
+import os.path
+import io
+
+# import config # @UnusedImport
+# from fablib import (export_version, do_sync_web, create_config,
+# clean_export_folder, do_sync_comp, sync_install_build, do_create_virtualenv,
+# clean_rsync_folder, rsync_export, get_src_version, sync_build,
+# do_relaunch_server, install_build, do_create_virtualenv_requirement, build_src)
+from core import (export_version, build_src, get_src_version, sync_build,
+ do_create_virtualenv_requirement, get_src_dependencies,
+ do_relaunch_server, clean_export_folder)
+# from fabric import task, env, run, cd, put
+from fabric import Connection
+from invoke import task
+from blessings import Terminal
+# from fabric.colors import green
+
+# env.use_ssh_config = True
+
+t = Terminal()
+
+def build_source(c, key, version):
+ print(t.green("build source with version %s" % version))
+ export_path = export_version(c, **{ key: version })
+ export_path_full = os.path.join(export_path, key, c.env.repos[key]['src_root'])
+ build_src(c, export_path_full)
+ (_,version_str) = get_src_version(c, key, export_path_full)
+ src_dep = get_src_dependencies(c, key, export_path_full)
+ return (os.path.join(export_path_full,"dist","%s-%s.tar.gz" % (key,version_str)), src_dep)
+
+
+def do_create_virtualenv(c, remote_build_path, dep_remote_build_path_list):
+ env = c.env
+ requirements_path = os.path.join(remote_build_path, env['repos'][env.key]['requirements'])
+ remotevirtualenvpath = env['remote_path']['virtualenv']
+ do_create_virtualenv_requirement(c, requirements_path, remotevirtualenvpath, env['repos'][env.key]['python_version'])
+ # add setting path to virtualenv
+ ext_path = "import sys; sys.__plen = len(sys.path)\n"
+ for l in env['remote_path'].get('pythonpath', []):
+ ext_path += l + "\n"
+ ext_path += "import sys; new=sys.path[sys.__plen:]; del sys.path[sys.__plen:]; p=getattr(sys,'__egginsert',0); sys.path[p:p]=new; sys.__egginsert = p+len(new)"
+ with Connection(env['hosts'][0]) as rconnection:
+ rconnection.put(io.StringIO(ext_path), os.path.join(env['remote_path']['virtualenv'], 'lib/python%s/site-packages/_virtualenv_path_extensions.pth'%env['repos'][env.key]['python_version']))
+ for dep_remote_build_path in dep_remote_build_path_list:
+ with rconnection.prefix("echo $SHELL && . \"%s\"" % os.path.join(remotevirtualenvpath, "bin/activate")):
+ rconnection.run("pip install \"%s\"" % dep_remote_build_path)
+ with rconnection.prefix("echo $SHELL && . \"%s\"" % os.path.join(remotevirtualenvpath, "bin/activate")):
+ rconnection.run("pip install \"%s.tar.gz\"" % remote_build_path)
+
+
+@task
+def relaunch_server(c, collectstatic=True, migrate=True):
+ print("Relaunch server")
+ do_relaunch_server(c, collectstatic, migrate)
+
+
+@task
+def create_virtualenv(c, version):
+
+ print(t.green("create virtualenv for version ") + version)
+ build_path, source_dep_list = build_source(c, c.env.key, version)
+ print(t.green("BUILD PATH: ") + build_path + " - %r" % source_dep_list)
+
+ source_dep_build_path_list = []
+ print("Build dependencies : %r" % source_dep_list)
+ for source_dep_key, source_dep_version in source_dep_list:
+ source_dep_build_path, _ = build_source(c, source_dep_key, source_dep_version)
+ source_dep_build_path_list.append(source_dep_build_path)
+
+ host_connection = Connection(c.env['hosts'][0])
+ host_connection.run('mkdir -p "%s"' % c.env['remote_path']['build_export'])
+
+ res_trans = sync_build(c, build_path)
+ res_trans_dep = [ sync_build(c, source_dep_build_path).remote for source_dep_build_path in source_dep_build_path_list]
+ # untar build
+ print("Untar %s on remote host"%res_trans.remote)
+ with host_connection.cd(c.env['remote_path']['build_export']):
+ host_connection.run('tar zxf %s' % res_trans.remote)
+
+ do_create_virtualenv(c, res_trans.remote[0:-7], res_trans_dep)
+
+ host_connection.run('rm -fr "%s/*"' % (c.env['remote_path']['build_export']))
+ clean_export_folder(c.env.remote_path['build_export'])
+
+
+@task
+def publish_version(c, version):
+ create_virtualenv(c, version)
+ relaunch_server(c, True, True)
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/sbin/sync/fabric.py.tmpl Wed Jun 27 16:00:29 2018 +0200
@@ -0,0 +1,75 @@
+from fabric.api import env
+from random import choice
+
+env.hosts = ['<user>@<server>']
+
+env.web_group = 'iri'
+env.folders = ['log', 'static/media']
+
+env.repos = {'iconolab' : {'repo':"<path to repo>", 'src_root':'src', 'requirements': 'requirements/prod.txt', 'python_version': '3.5'}}
+env.base_export_path = "/tmp"
+env.export_prefix = "iconolab"
+env.key = 'iconolab'
+
+env.remote_path = {
+ 'web':"/etc/www/iconolab/",
+ 'src':"/etc/www/iconolab",
+ 'virtualenv':"/srv/virtualenv/iconolab",
+ 'build_export':"/tmp/build",
+ 'pythonpath' : ['/etc/www/iconolab']
+}
+
+#env.remote_path = {
+# 'web':"/var/www/iconolab/",
+# 'src':"/Users/ymh/dev/tmp/testfab/src",
+# 'virtualenv':"/Users/ymh/dev/tmp/testiconolab/virtualenv/iconolab",
+# 'build_export':"/tmp/build",
+# 'pythonpath' : ['/etc/www/iconolab']
+#}
+
+
+env.platform_web_module = "iconolab"
+env.remote_baseline_venv = ""
+
+env.rsync_filters = {
+ 'src' : [
+ "P .htpasswd",
+ "P .htaccess",
+ "P egonomy/config.py",
+ ],
+ 'web': [
+ "+ core",
+ "P .htpasswd",
+ "P .htaccess",
+ "P robots.txt",
+ "P env/***",
+ "P log/***",
+ "P index/***",
+ "P static/media/***",
+ "P crossdomain.xml",
+ ],
+ 'venv': [
+ "+ core",
+ ]
+}
+env.web_relaunch_cmd = "supervisorctl restart iconolab"
+env.settings = "iconolab_settings"
+env.admin_cmd = "django-admin"
+env.check_folder_access = False
+
+env.config = {
+ 'web': {
+ 'base_url': "/",
+ 'web_url': 'http://egonomy.iri-resesarch.org',
+ 'db_engine':'postgresql_psycopg2',
+ 'db_name':'platform',
+ 'db_user': 'iriuser',
+ 'db_password': '',
+ 'db_host': 'sql.iri.centrepompidou.fr',
+ 'db_port': 5432,
+ 'haystack_url' : 'http://localhost:9200',
+ 'haystack_index' : 'egonomy',
+ 'log_file' : env.remote_path['web'] + '/log/log.txt',
+ 'secret_key' : ''.join([choice('abcdefghijklmnopqrstuvwxyz0123456789!@#$%^&*(-_=+)') for i in range(50)]),
+ },
+}
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/sbin/sync/requirements.txt Wed Jun 27 16:00:29 2018 +0200
@@ -0,0 +1,2 @@
+fabric
+requirements-parser
--- a/src/iconolab_episteme/management/commands/importcollection.py Tue Jun 26 15:55:08 2018 +0200
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,143 +0,0 @@
-# -*- coding: UTF-8 -*-
-import json
-import logging
-import os
-import pprint
-import re
-import shutil
-import imghdr
-
-from django.conf import settings
-from django.core.management.base import BaseCommand, CommandError
-from PIL import Image as ImagePIL
-from sorl.thumbnail import get_thumbnail
-
-from iconolab.management.commands.importimages import BaseImportImagesCommand
-from iconolab.models import (Collection, Folder, Image, ImageStats, Item,
- ItemMetadata, MetaCategory)
-
-if settings.IMPORT_LOGGER_NAME and settings.LOGGING['loggers'].get(settings.IMPORT_LOGGER_NAME, ''):
- logger = logging.getLogger(settings.IMPORT_LOGGER_NAME)
-else:
- logger = logging.getLogger(__name__)
-
-class Command(BaseImportImagesCommand):
- help = 'import collection image and file from a directory'
-
- def add_arguments(self, parser):
- parser.add_argument('source_dir')
- parser.add_argument(
- '--encoding',
- dest='encoding',
- default='utf-8',
- help='JSON file encoding'
-
- )
- parser.add_argument(
- '--collection-json',
- dest='collection_json',
- default=False,
- help='creates a new collection from a json file, must be an object with fields : '+
- '"name" (identifier), '+
- '"verbose_name" (proper title name), '+
- '"description" (description on homepage, html is supported), '+
- '"image" (image on homepages, must be "uploads/<imgname>"), '+
- '"height" and "width" (height and width of the image)',
- )
- parser.add_argument(
- '--collection-id',
- dest='collection_id',
- default=False,
- help='insert extracted data into the specified collection instead of trying to load a collection fixture',
- )
- parser.add_argument(
- '--no-jpg-conversion',
- dest='no-jpg-conversion',
- default=False,
- help='use this option if you only want the image copied and not converted'
- )
-
- def handle(self, *args, **options):
-
- print('# Logging with logger '+logger.name)
- logger.debug('# Initializing command with args: %r', options)
-
- '''Check we have a collection to store data into'''
- self.source_dir = options.get('source_dir')
- print('# Checking collection args')
- if options.get('collection_json'):
- print('## Finding collection json data in '+self.source_dir)
- collection_json_path = os.path.join(
- self.source_dir, options.get('collection_json'))
- if not os.path.isfile(collection_json_path):
- print('### No '+options.get('collection_json') +
- '.json file was found in the source directory')
- raise ValueError(
- '!!! Json file '+collection_json_path+' was not found !!!')
- try:
- with open(collection_json_path) as json_fixture_file:
- collection_data = json.loads(json_fixture_file.read())
- for key in ['name', 'verbose_name', 'description', 'image', 'height', 'width']:
- if not key in collection_data.keys():
- print('!!! Json file '+collection_json_path +
- ' has no '+key+' field !!!')
- raise ValueError()
- if not collection_data.get('name', ''):
- print('!!! Collection data key "name" is empty')
- raise ValueError()
- if Collection.objects.filter(name=collection_data.get('name')).exists():
- print(
- '!!! A Collection with the provided name already exists!')
- raise ValueError()
- if collection_data.get('image', '') and not (collection_data.get('width', 0) and collection_data.get('height', 0)):
- print(
- '!!! Collection data has an image but no height and width')
- raise ValueError()
- except ValueError as e:
- raise ValueError('!!! JSON Data is invalid. !!!')
- elif options.get('collection_id'):
- print('## Finding collection with id ' +
- options.get('collection_id'))
- try:
- collection = Collection.objects.get(
- pk=options.get('collection_id'))
- except Collection.DoesNotExist:
- raise ValueError('!!! Collection with primary key ' +
- options.get('collection_id')+' was not found, aborting !!!')
- else:
- raise ValueError(
- '!!! No collection fixture or collection id, aborting because we can\'t properly generate data. !!!')
-
-
- '''Import image collection in target directory'''
-
- if options.get('collection_json'):
- print('## Loading collection json')
- collection = Collection.objects.create(
- name=collection_data.get('name'),
- verbose_name=collection_data.get('verbose_name', ''),
- description=collection_data.get('description', ''),
- image=collection_data.get('image', ''),
- height=collection_data.get('height', 0),
- width=collection_data.get('width', 0),
- )
-
- if collection.image:
- collection_image_path = os.path.join(
- settings.MEDIA_ROOT, str(collection.image))
- if not os.path.isfile(collection_image_path):
- print('### Moving collection image')
- _, collection_image_name = os.path.split(
- collection_image_path)
- try:
- col_im = ImagePIL.open(os.path.join(
- self.source_dir, collection_image_name))
- print('##### Generating or copying jpeg for ' +
- collection_image_name)
- col_im.thumbnail(col_im.size)
- col_im.save(collection_image_path, 'JPEG', quality=options.get(
- 'jpeg_quality', settings.IMG_JPG_DEFAULT_QUALITY))
- except Exception as e:
- print(e)
-
-
--- a/src/iconolab_episteme/management/commands/importmetacategories.py Tue Jun 26 15:55:08 2018 +0200
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,90 +0,0 @@
-# -*- coding: UTF-8 -*-
-import json
-import logging
-import os
-import pprint
-import re
-import shutil
-
-from django.conf import settings
-from django.core.management.base import BaseCommand, CommandError
-
-from iconolab.management.commands.importimages import BaseImportImagesCommand
-from iconolab.models import (Collection, Folder, Image, ImageStats, Item,
- ItemMetadata, MetaCategory)
-
-if settings.IMPORT_LOGGER_NAME and settings.LOGGING['loggers'].get(settings.IMPORT_LOGGER_NAME, ''):
- logger = logging.getLogger(settings.IMPORT_LOGGER_NAME)
-else:
- logger = logging.getLogger(__name__)
-
-class Command(BaseImportImagesCommand):
- help = 'import metacategories files from a directory'
-
- def add_arguments(self, parser):
- parser.add_argument('source_dir')
- parser.add_argument(
- '--encoding',
- dest='encoding',
- default='utf-8',
- help='JSON file encoding'
-
- )
- parser.add_argument(
- '--collection-id',
- dest='collection_id',
- default=False,
- help='insert extracted data into the specified collection instead of trying to load a collection fixture',
- )
- parser.add_argument(
- '--metacategories-json',
- dest='metacategories_json',
- default=False,
- help='add metacategories to the collection from a json file (json must be a list of object with "label" and "triggers_notifications" fields)',
- )
-
- def handle(self, *args, **options):
-
- print('# Logging with logger '+logger.name)
- logger.debug('# Initializing command with args: %r', options)
-
- self.source_dir = options.get('source_dir')
-
- if options.get('collection_id'):
- print('## Finding collection with id ' +
- options.get('collection_id'))
- try:
- collection = Collection.objects.get(
- pk=options.get('collection_id'))
- except Collection.DoesNotExist:
- raise ValueError('!!! Collection with primary key ' +
- options.get('collection_id')+' was not found, aborting !!!')
- else:
- raise ValueError(
- '!!! No collection fixture or collection id, aborting because we can\'t properly generate data. !!!')
-
- if options.get('metacategories_json'):
- print('## Finding metacategories fixture json data in '+self.source_dir)
- metacategories_json_path = os.path.join(
- self.source_dir, options.get('metacategories_json'))
- if not os.path.isfile(metacategories_json_path):
- print('### No '+options.get('metacategories_json')+
- '.json file was found in the source directory')
- raise ValueError(
- '!!! Fixture file '+metacategories_json_path+' was not found !!!')
- with open(metacategories_json_path) as metacategories_json_file:
- metacategories_data = json.loads(
- metacategories_json_file.read())
- for metacategory in metacategories_data:
- if metacategory.get('label', None) is None:
- raise ValueError(
- '!!! Metacategory without label !!!')
-
- if options.get('metacategories_json'):
- for metacategory in metacategories_data:
- MetaCategory.objects.create(
- collection=collection,
- label=metacategory.get('label'),
- triggers_notifications=metacategory.get(
- 'triggers_notifications', 0)
- )
\ No newline at end of file
--- a/src/iconolab_episteme/settings/dev.py Tue Jun 26 15:55:08 2018 +0200
+++ /dev/null Thu Jan 01 00:00:00 1970 +0000
@@ -1,233 +0,0 @@
-"""
-Django settings for iconolab-episteme project.
-
-Generated by 'django-admin startproject' using Django 1.9.5.
-
-For more information on this file, see
-https://docs.djangoproject.com/en/1.9/topics/settings/
-
-For the full list of settings and their values, see
-https://docs.djangoproject.com/en/1.9/ref/settings/
-"""
-import logging
-import os
-
-from iconolab_episteme.settings import *
-
-CONTACT_EMAIL = 'youremail@yourprovider.fr'
-
-# Build paths inside the project like this: os.path.join(BASE_DIR, ...)
-# BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
-
-# STATIC_ROOT = os.path.abspath(os.path.join(BASE_DIR, '../../../run/web/static/site'))
-# MEDIA_ROOT = os.path.abspath(os.path.join(BASE_DIR, '../../../run/web'))
-BASE_DIR = '/mnt/c/Users/Riwad/Desktop/IRI/ICONOLAB/iconolab-episteme/src/iconolab_episteme'
-
-STATIC_ROOT = '/mnt/c/Users/Riwad/Desktop/IRI/ICONOLAB/iconolab-episteme/run/web/static/site'
-MEDIA_ROOT = '/mnt/c/Users/Riwad/Desktop/IRI/ICONOLAB/iconolab-episteme/run/web/media'
-
-# dev_mode useful for src_js
-# We need to add 'iconolab.utils.context_processors.env' to context processor
-
-# When JS_DEV_MODE is True, the Webpack dev server should be started
-JS_DEV_MODE = False
-# STATICFILES_DIRS = [
-# os.path.join(BASE_DIR, 'static'),
-# os.path.join(BASE_DIR, 'media'),
-# ]
-
-if JS_DEV_MODE:
- SRC_JS_PATH = os.path.join(BASE_DIR, '..', '..', 'src_js')
- STATICFILES_DIRS.append(SRC_JS_PATH)
-
-
-BASE_URL = 'http://localhost:8000'
-if JS_DEV_MODE:
- STATIC_URL = 'http://localhost:8001/static/'
-else:
- STATIC_URL = '/static/'
-MEDIA_URL = '/media/'
-
-LOGIN_URL = '/account/login/'
-
-# Quick-start development settings - unsuitable for production
-# See https://docs.djangoproject.com/en/1.9/howto/deployment/checklist/
-
-# SECURITY WARNING: keep the secret key used in production secret!
-SECRET_KEY = '#8)+upuo3vc7fi15czxz53ml7*(1__q8hg=m&+9ylq&st1_kqv'
-
-# SECURITY WARNING: don't run with debug turned on in production!
-DEBUG = True
-THUMBNAIL_DEBUG = True
-
-ALLOWED_HOSTS = []
-
-
-# Application definition
-
-
-COMMENTS_APP = "django_comments_xtd"
-COMMENTS_XTD_MODEL = "iconolab.models.IconolabComment"
-COMMENTS_XTD_FORM_CLASS = 'iconolab.forms.comments.IconolabCommentForm'
-COMMENTS_XTD_MAX_THREAD_LEVEL = 1
-COMMENTS_PER_PAGE_DEFAULT = 10
-
-SITE_ID = 1
-
-TEMPLATES = [
- {
- 'BACKEND': 'django.template.backends.django.DjangoTemplates',
- 'DIRS': [],
- 'APP_DIRS': True,
- 'OPTIONS': {
- 'context_processors': [
- 'django.template.context_processors.debug',
- 'django.template.context_processors.request',
- 'django.contrib.auth.context_processors.auth',
- 'django.contrib.messages.context_processors.messages',
- 'django.template.context_processors.media',
- 'django.template.context_processors.static',
- 'django.template.context_processors.i18n',
- 'iconolab.utils.context_processors.env',
- ],
- # 'libraries': {
- # 'iconolab_episteme_tags':'iconolab_episteme.templatetags.iconolab_episteme_tags'
- # }
- },
- },
-]
-
-WSGI_APPLICATION = 'iconolab_episteme.wsgi.application'
-
-
-# Database
-# https://docs.djangoproject.com/en/1.9/ref/settings/#databases
-
-DATABASES = {
- 'default': {
- 'ENGINE': 'django.db.backends.postgresql_psycopg2', # Add 'postgresql_psycopg2', 'postgresql', 'mysql', 'sqlite3' or 'oracle'.
- 'NAME': 'iconolabepisteme', # Or path to database file if using sqlite3.
- 'USER': 'iri', # Not used with sqlite3.
- 'PASSWORD': 'iri', # Not used with sqlite3.
- 'HOST': '192.168.99.100', # Set to empty string for localhost. Not used with sqlite3.
- 'PORT': '5432', # Set to empty string for default. Not used with sqlite3.
- }
-}
-
-# Logging
-
-LOG_FILE = os.path.abspath(os.path.join(BASE_DIR,"../../run/log/log.txt"))
-IMPORT_LOG_FILE = os.path.abspath(os.path.join(BASE_DIR,"../../run/log/import_log.txt"))
-IMPORT_LOGGER_NAME = "import_command"
-LOG_LEVEL = logging.DEBUG
-LOGGING = {
- 'version': 1,
- 'disable_existing_loggers': True,
- 'filters': {
- 'require_debug_false': {
- '()': 'django.utils.log.RequireDebugFalse'
- }
- },
- 'formatters' : {
- 'simple' : {
- 'format': "%(asctime)s - %(levelname)s : %(message)s",
- },
- 'semi-verbose': {
- 'format': '%(levelname)s %(asctime)s %(module)s %(message)s'
- },
- },
- 'handlers': {
- 'mail_admins': {
- 'level': 'ERROR',
- 'filters': ['require_debug_false'],
- 'class': 'django.utils.log.AdminEmailHandler'
- },
- 'stream_to_console': {
- 'level': LOG_LEVEL,
- 'class': 'logging.StreamHandler'
- },
- 'file': {
- 'level': LOG_LEVEL,
- 'class': 'logging.FileHandler',
- 'filename': LOG_FILE,
- 'formatter': 'semi-verbose',
- },
- 'import_file': {
- 'level': LOG_LEVEL,
- 'class': 'logging.FileHandler',
- 'filename': IMPORT_LOG_FILE,
- 'formatter': 'semi-verbose',
- }
- },
- 'loggers': {
- 'django.request': {
- 'handlers': ['file'],
- 'level': LOG_LEVEL,
- 'propagate': True,
- },
- 'iconolab': {
- 'handlers': ['file'],
- 'level': LOG_LEVEL,
- 'propagate': True,
- },
- 'import_command': {
- 'handlers': ['import_file'],
- 'level': LOG_LEVEL,
- 'propagate': True,
- },
- }
-}
-
-ELASTICSEARCH_DSL = {
- 'default': {
- 'hosts': '192.168.99.100:9200'
- },
-}
-
-CACHES = {
- 'default': {
- 'BACKEND': 'django.core.cache.backends.filebased.FileBasedCache',
- 'LOCATION': os.path.join(MEDIA_ROOT, 'cache'),
-# 'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
-# 'LOCATION': 'unix:/var/run/memcached/memcached.socket',
-# 'KEY_PREFIX': 'ldt',
- }
-}
-# Password validation
-# https://docs.djangoproject.com/en/1.9/ref/settings/#auth-password-validators
-
-AUTH_PASSWORD_VALIDATORS = [
- {
- 'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator',
- },
- {
- 'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator',
- },
- {
- 'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator',
- },
- {
- 'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator',
- },
-]
-
-
-# Internationalization
-# https://docs.djangoproject.com/en/1.9/topics/i18n/
-
-LANGUAGE_CODE = 'en-us'
-
-TIME_ZONE = 'UTC'
-
-USE_I18N = True
-
-USE_L10N = True
-
-USE_TZ = True
-
-INTERNAL_TAGS_URL = BASE_URL
-
-ESO_NOTICE_BASE_URL = "https://www.eso.org/public/france/images/"
-
-RELEVANT_TAGS_MIN_SCORE = 3
-ACCURATE_TAGS_MIN_SCORE = 3
--- a/src/iconolab_episteme/settings/dev.py.tmpl Tue Jun 26 15:55:08 2018 +0200
+++ b/src/iconolab_episteme/settings/dev.py.tmpl Wed Jun 27 16:00:29 2018 +0200
@@ -229,7 +229,7 @@
"DENO": [],
"DOM": ["Domaine"],
"APPL": [],
- "PERI": ["Période"],
+ "PERI": ["Période"],
"MILL": [],
"TECH": [],
"DIMS": ["Dimensions"],
@@ -243,7 +243,6 @@
}
INTERNAL_TAGS_URL = BASE_URL
-JOCONDE_NOTICE_BASE_URL = "http://www.culture.gouv.fr/public/mistral/joconde_fr?ACTION=CHERCHER&FIELD_98=REF&VALUE_98="
RELEVANT_TAGS_MIN_SCORE = 3
ACCURATE_TAGS_MIN_SCORE = 3
--- a/src/iconolab_episteme/urls.py Tue Jun 26 15:55:08 2018 +0200
+++ b/src/iconolab_episteme/urls.py Wed Jun 27 16:00:29 2018 +0200
@@ -19,7 +19,6 @@
from django.contrib import admin
from django.contrib.staticfiles.urls import staticfiles_urlpatterns
from django.urls import include, path, re_path, reverse_lazy
-from iconolab_episteme import views
import iconolab.urls
--- /dev/null Thu Jan 01 00:00:00 1970 +0000
+++ b/src/setup.py Wed Jun 27 16:00:29 2018 +0200
@@ -0,0 +1,150 @@
+import os
+from setuptools import setup
+from distutils.command.install_data import install_data
+from distutils.command.install import INSTALL_SCHEMES
+import sys
+
+
+class osx_install_data(install_data):
+ # On MacOS, the platform-specific lib dir is /System/Library/Framework/Python/.../
+ # which is wrong. Python 2.5 supplied with MacOS 10.5 has an Apple-specific fix
+ # for this in distutils.command.install_data#306. It fixes install_lib but not
+ # install_data, which is why we roll our own install_data class.
+
+ def finalize_options(self):
+ # By the time finalize_options is called, install.install_lib is set to the
+ # fixed directory, so we set the installdir to install_lib. The
+ # install_data class uses ('install_data', 'install_dir') instead.
+ self.set_undefined_options('install', ('install_lib', 'install_dir'))
+ install_data.finalize_options(self)
+
+def fullsplit(path, result=None):
+ """
+ Split a pathname into components (the opposite of os.path.join) in a
+ platform-neutral way.
+ """
+ if result is None:
+ result = []
+ head, tail = os.path.split(path)
+ if head == '':
+ return [tail] + result
+ if head == path:
+ return result
+ return fullsplit(head, [tail] + result)
+
+
+def launch_setup(script_name, script_args):
+ if sys.platform == "darwin":
+ cmdclasses = {'install_data': osx_install_data}
+ else:
+ cmdclasses = {'install_data': install_data}
+
+
+ root_dir = os.path.dirname(__file__)
+ if root_dir != '':
+ os.chdir(root_dir)
+ source_dirs = ['iconolab_episteme',]
+
+ version_variables = {}
+ try:
+ with open(os.path.join(source_dirs[0], "__init__.py")) as f:
+ code = compile(f.read(), "__init__.py", 'exec')
+ exec(code, version_variables)
+    except OSError:
+        pass
+
+    version = version_variables.get('__version__', '0.0.0')
+
+ packages, data_files = [], []
+
+ for source_dir in source_dirs:
+ for dirpath, dirnames, filenames in os.walk(source_dir):
+            # Ignore dirnames that start with '.' or are __pycache__;
+            # filter the list in place (deleting while enumerating skips entries)
+            dirnames[:] = [d for d in dirnames if not d.startswith('.') and not d.startswith('__pycache__')]
+ if '__init__.py' in filenames:
+ packages.append('.'.join(fullsplit(dirpath)))
+ elif filenames:
+ data_files.append([dirpath, [os.path.join(dirpath, f) for f in filenames]])
+
+
+ # Tell distutils to put the data_files in platform-specific installation
+ # locations. See here for an explanation:
+ # http://groups.google.com/group/comp.lang.python/browse_thread/thread/35ec7b2fed36eaec/2105ee4d9e8042cb
+ for scheme in INSTALL_SCHEMES.values():
+ scheme['data'] = scheme['purelib']
+
+ # Small hack for working with bdist_wininst.
+ # See http://mail.python.org/pipermail/distutils-sig/2004-August/004134.html
+ if len(sys.argv) > 1 and sys.argv[1] == 'bdist_wininst':
+ for file_info in data_files:
+ file_info[0] = '\\PURELIB\\%s' % file_info[0]
+
+ #write MANIFEST.in
+
+ with open("MANIFEST.in", "w") as m:
+ m.write("include CHANGES\n")
+ m.write("include LICENSE\n")
+ m.write("include README.md\n")
+ m.write("include MANIFEST.in\n")
+ m.write("include requirements/base.txt\n")
+ m.write("include requirements/dev.txt\n")
+ m.write("include requirements/prod.txt\n")
+ for entry in data_files:
+ file_list = entry[1]
+ for filename in file_list:
+ m.write("include %s\n" % (filename))
+
+ return setup(
+ script_name=script_name,
+ script_args=script_args,
+ name='iconolab_episteme',
+ version=version,
+ author='IRI',
+ author_email='contact@iri.centrepompidou.fr',
+ packages=packages,
+ data_files=data_files,
+ cmdclass=cmdclasses,
+ scripts=[],
+ url='http://www.iri.centrepompidou.fr/dev/hg/iconolab_episteme',
+ license='LICENSE',
+        description='Iconolab project for Episteme',
+        long_description="Iconolab project for Episteme",
+ classifiers=[
+ 'Development Status :: 4 - Beta',
+ 'Environment :: Web Environment',
+ 'Framework :: Django',
+ 'Intended Audience :: Developers',
+            'License :: CeCILL-B',
+ 'Operating System :: OS Independent',
+ 'Programming Language :: Python',
+ 'Topic :: Utilities'
+ ],
+ setup_requires=['setuptools_scm'],
+ install_requires=[
+ "Django >= 2.0",
+ "iconolab == 0.1.3",
+ "django-appconf",
+ "django-comments-xtd",
+ "django-contrib-comments",
+ "elasticsearch-dsl >= 6.0, < 7.0",
+ "django-elasticsearch-dsl",
+ "django-notifications-hq",
+ "elasticsearch",
+ "jsonfield",
+ "Pillow",
+ "pytz",
+ "requests",
+ "six",
+ "sorl-thumbnail >= 12.4.1",
+ "djangorestframework >= 3.8"
+ ],
+ )
+
+
+if __name__ == "__main__":
+
+ script_name = os.path.basename(sys.argv[0])
+ script_args = sys.argv[1:]
+
+ launch_setup(script_name, script_args)
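
The `fullsplit` helper in the new setup.py recursively splits a path into its components, which the package walk then joins with dots to form package names. Copied here standalone (pure stdlib) so its behavior can be checked:

```python
import os

def fullsplit(path, result=None):
    """Split a pathname into components (the inverse of os.path.join)."""
    if result is None:
        result = []
    head, tail = os.path.split(path)
    if head == '':
        return [tail] + result
    if head == path:
        return result
    return fullsplit(head, [tail] + result)

# As used in launch_setup to turn a walked dirpath into a package name:
package = '.'.join(fullsplit('iconolab_episteme/management/commands'))
```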