Compare commits


1 Commit

Author SHA1 Message Date
5af5374f77 config : fallback to 22 instead of None
The configparser fallback option was None; set it to use 22 instead, as None doesn't make sense.
2026-01-01 17:30:28 +01:00
20 changed files with 118 additions and 647 deletions
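
The commit message refers to configparser's fallback handling: with fallback=None, a missing option such as the ssh port comes back as None instead of a number, which nothing downstream can use. A minimal standalone sketch of the difference (not the project's code):

    from configparser import ConfigParser

    config = ConfigParser()
    config.read_string("[Server]\nuser = alice\n")           # no "port" option set

    # Old behaviour: a missing option silently becomes None.
    print(config.getint("Server", "port", fallback=None))    # None

    # New behaviour: fall back to the standard ssh port instead.
    print(config.getint("Server", "port", fallback=22))      # 22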

.gitignore vendored

@@ -1,4 +1,2 @@
poetry.lock
__pycache__
docs/build
dist/


@@ -23,4 +23,5 @@ The issue is that you need to know what data is stored on the server to avoid co
# Development
Unisync was at first a simple bash script, but as it grew more complex I started struggling to maintain it, which is why I am porting it to Python. It will make everything more robust, easier to maintain, and easier to extend with new functionality.
I am in the early stages of the development process; this should be usable someday (hopefully).
I am in the early stages of the development process; this should be usable in the upcoming weeks.
Help will be welcome in the future but is not desirable right now, as I want to shape this the way I want to.


@@ -1,20 +0,0 @@
# Minimal makefile for Sphinx documentation
#
# You can set these variables from the command line, and also
# from the environment for the first two.
SPHINXOPTS ?=
SPHINXBUILD ?= sphinx-build
SOURCEDIR = source
BUILDDIR = build
# Put it first so that "make" without argument is like "make help".
help:
@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
.PHONY: help Makefile
# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)


@@ -1,35 +0,0 @@
@ECHO OFF
pushd %~dp0
REM Command file for Sphinx documentation
if "%SPHINXBUILD%" == "" (
set SPHINXBUILD=sphinx-build
)
set SOURCEDIR=source
set BUILDDIR=build
%SPHINXBUILD% >NUL 2>NUL
if errorlevel 9009 (
echo.
echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
echo.installed, then set the SPHINXBUILD environment variable to point
echo.to the full path of the 'sphinx-build' executable. Alternatively you
echo.may add the Sphinx directory to PATH.
echo.
echo.If you don't have Sphinx installed, grab it from
echo.https://www.sphinx-doc.org/
exit /b 1
)
if "%1" == "" goto help
%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
goto end
:help
%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
:end
popd


@@ -1,36 +0,0 @@
# Configuration file for the Sphinx documentation builder.
#
# For the full list of built-in configuration values, see the documentation:
# https://www.sphinx-doc.org/en/master/usage/configuration.html
# -- Project information -----------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#project-information
project = 'unisync'
copyright = '2026, Paul Retourné'
author = 'Paul Retourné'
release = '0.1.0'
# -- General configuration ---------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#general-configuration
extensions = [
'sphinx.ext.autodoc',
'sphinx.ext.viewcode',
'sphinx.ext.napoleon',
'sphinx.ext.todo'
]
templates_path = ['_templates']
exclude_patterns = []
# -- Options for HTML output -------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#options-for-html-output
#html_theme = 'alabaster'
html_theme = 'sphinx_rtd_theme'
html_static_path = ['_static']
autodoc_docstring_signature = True


@@ -1,24 +0,0 @@
.. _example_how_it_works:
Example of how unisync works
============================
Let's say you have the following structure::
$ tree .
.
├── big_file
└── folder
   ├── file
   └── other_file
If you only want to synchronise `folder` and its contents on your laptop, the following will be automatically generated::
$ tree .
.
├── big_file -> ../.data/big_file
└── folder
   ├── file
   └── other_file
`big_file` is now a symbolic link, and by mounting the remote directory you can still seamlessly access `big_file` over the network.
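
The example above (removed in this compare) describes the core mechanism: files not selected for local storage are replaced by symbolic links into a `.data` directory that is later mounted from the server. A rough sketch of what such a replacement could look like, with a hypothetical replace_with_link helper (the real logic lives in the unisync modules):

    from pathlib import Path

    def replace_with_link(local_file: Path, data_dir: Path) -> None:
        """Move local_file into data_dir and leave a relative symlink behind.

        Illustrative only: assumes data_dir sits next to local_file's parent
        directory, matching the tree above (big_file -> ../.data/big_file).
        """
        data_dir.mkdir(parents=True, exist_ok=True)
        local_file.rename(data_dir / local_file.name)        # move the real file aside
        local_file.symlink_to(Path("..") / data_dir.name / local_file.name)

    # e.g. replace_with_link(Path("files/big_file"), Path(".data"))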


@@ -1,31 +0,0 @@
.. unisync documentation master file, created by
sphinx-quickstart on Sun Jan 4 15:02:58 2026.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Documentation for unisync
=========================
Unisync is a data synchronising tool built around `unison`_ and expanding on it.
Unisync tries to solve two problems that are often solved separately but never together:
* Keeping your data synchronised between multiple machines (through a central server); examples of this are rsync and of course unison.
* Being able to access and edit files stored on your server without having to download them, as the Nextcloud GUI does for example.
* And of course I want to be able to do all of this without ever having to leave my terminal.
Unisync solves this by making every file appear on your local machine while only the selected files and folders are physically present on your drive;
the others are replaced by symbolic links pointing to a directory that is mounted from your server.
See this
:ref:`example_how_it_works`.
.. _unison: https://github.com/bcpierce00/unison
.. toctree::
:maxdepth: 2
:caption: Contents:
example
modules
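
The mounting mentioned here is what Synchroniser.mount_remote_dir (further down in this compare) does with sshfs. A rough, illustrative sketch of such a mount; the user, host, and which remote path gets mounted are assumptions, not taken from the code:

    import subprocess
    from pathlib import Path

    def mount_remote_data(user: str, host: str, remote_dir: str,
                          mountpoint: Path, port: int = 22) -> None:
        """Mount a remote data directory onto a local mountpoint via sshfs."""
        mountpoint.mkdir(parents=True, exist_ok=True)
        subprocess.run(
            ["sshfs", "-p", str(port), f"{user}@{host}:{remote_dir}/.data", str(mountpoint)],
            check=True,   # raises CalledProcessError if sshfs fails
        )

    # e.g. mount_remote_data("alice", "example.org", "unisync", Path.home() / ".data")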


@@ -1,7 +0,0 @@
unisync
=======
.. toctree::
:maxdepth: 4
unisync


@@ -1,77 +0,0 @@
unisync package
===============
Submodules
----------
unisync.argparser module
------------------------
.. automodule:: unisync.argparser
:members:
:show-inheritance:
:undoc-members:
unisync.config module
---------------------
.. automodule:: unisync.config
:members:
:show-inheritance:
:undoc-members:
unisync.defaults module
-----------------------
.. automodule:: unisync.defaults
:members:
:show-inheritance:
:undoc-members:
unisync.errors module
---------------------
.. automodule:: unisync.errors
:members:
:show-inheritance:
:undoc-members:
unisync.main module
-------------------
.. automodule:: unisync.main
:members:
:show-inheritance:
:undoc-members:
unisync.paths module
--------------------
.. automodule:: unisync.paths
:members:
:show-inheritance:
:undoc-members:
unisync.runners module
----------------------
.. automodule:: unisync.runners
:members:
:show-inheritance:
:undoc-members:
unisync.synchroniser module
---------------------------
.. automodule:: unisync.synchroniser
:members:
:show-inheritance:
:undoc-members:
Module contents
---------------
.. automodule:: unisync
:members:
:show-inheritance:
:undoc-members:


@@ -10,9 +10,6 @@ requires-python = ">=3.13"
dependencies = [
]
[project.scripts]
unisync = "unisync.main:main"
[tool.poetry]
packages = [{include = "unisync", from = "src"}]
@@ -20,12 +17,3 @@ packages = [{include = "unisync", from = "src"}]
[build-system]
requires = ["poetry-core>=2.0.0,<3.0.0"]
build-backend = "poetry.core.masonry.api"
[dependency-groups]
docs = [
"sphinx (>=9.1.0,<10.0.0)",
"sphinx-rtd-theme (>=3.0.2,<4.0.0)",
]
dev = [
"pylint (>=4.0.4,<5.0.0)"
]


@@ -1,36 +1,22 @@
# Copyright (C) 2025-2026 Paul Retourné
# Copyright (C) 2025 Paul Retourné
# SPDX-License-Identifier: GPL-3.0-or-later
import argparse
def create_argparser(sync_function, add_function, mount_function) -> argparse.ArgumentParser:
"""
Creates an argument parser to parse the command line arguments.
We use subparsers and set a default function for each to perform the correct action.
"""
def create_argparser() -> argparse.ArgumentParser:
parser = argparse.ArgumentParser(
prog='unisync',
description='File synchronisation application',
epilog="Copyright © 2025-2026 Paul Retourné.\n"
"License GPLv3+: GNU GPL version 3 or later <https://gnu.org/licenses/gpl.html>.",
formatter_class=argparse.RawDescriptionHelpFormatter
epilog="""
Copyright © 2025 Paul Retourné.
License GPLv3+: GNU GPL version 3 or later <https://gnu.org/licenses/gpl.html>."""
)
parser.add_argument("local", nargs="?")
parser.add_argument("remote", nargs="?")
parser.set_defaults(func=sync_function)
remote_addr_group = parser.add_mutually_exclusive_group()
remote_addr_group.add_argument("--ip")
remote_addr_group.add_argument("--hostname")
parser.add_argument("--config", help="Path to the configuration file", metavar="path_to_config")
subparsers = parser.add_subparsers(help='Actions other than synchronisation')
parser_add = subparsers.add_parser('add', help='Add files to be synchronised.')
parser_add.set_defaults(func=add_function)
parser_mount = subparsers.add_parser('mount', help='Mount the remote.')
parser_mount.set_defaults(func=mount_function)
return parser
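
The docstring above notes that each subparser sets a default function, so the selected subcommand can be dispatched through the parsed namespace. A standalone sketch of that pattern (the action names here are placeholders, not the project's runners):

    import argparse

    def do_sync(args):    # placeholder action
        print("synchronising")

    def do_mount(args):   # placeholder action
        print("mounting the remote")

    parser = argparse.ArgumentParser(prog="demo")
    parser.set_defaults(func=do_sync)            # no subcommand -> synchronise

    subparsers = parser.add_subparsers(help="Actions other than synchronisation")
    parser_mount = subparsers.add_parser("mount", help="Mount the remote.")
    parser_mount.set_defaults(func=do_mount)     # "demo mount" dispatches here

    args = parser.parse_args()
    args.func(args)                              # call whichever action was selected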


@@ -1,24 +1,22 @@
# Copyright (C) 2025-2026 Paul Retourné
# Copyright (C) 2025 Paul Retourné
# SPDX-License-Identifier: GPL-3.0-or-later
from configparser import UNNAMED_SECTION
from dataclasses import dataclass
from dataclasses import dataclass, field
from pathlib import Path
import ipaddress
import configparser
from unisync.defaults import *
@dataclass
class ServerConfig:
"""
Dataclass keeping the config for connecting to the server
"""
user: str
sshargs: str
hostname: str
ip: str
port: int
sshargs: str = ""
hostname: str = ""
ip: str = ""
port: int = 22
def __post_init__(self):
"""
@@ -30,8 +28,8 @@ class ServerConfig:
if self.ip != "":
try:
ipaddress.ip_address(self.ip)
except ValueError as e:
raise ValueError("The provided ip address is invalid") from e
except ValueError:
raise ValueError("The provided ip address is invalid")
@dataclass
class RootsConfig:
@@ -46,27 +44,15 @@ class UnisonConfig:
"""
Dataclass keeping unison specific configurations
"""
bools: list
values: dict
@dataclass
class BackupConfig:
"""
Configuration options relative to backing up the files.
"""
enabled: bool
selection: str
location: str
max_backups: int
backupsuffix: str
backupprefix: str
bools: list = field(default_factory=list)
values: dict = field(default_factory=dict)
@dataclass
class OtherConfig:
"""
Dataclass keeping miscellaneous configuration options
"""
cache_dir_path: Path
cache_dir_path: Path = Path("~/.unisync").expanduser()
@dataclass
class Config:
@@ -76,8 +62,7 @@ class Config:
server: ServerConfig
roots: RootsConfig
unison: UnisonConfig
backup: BackupConfig
other: OtherConfig
other: OtherConfig = field(default_factory=OtherConfig)
def load_config(config_path:str) -> Config:
@@ -94,42 +79,29 @@ def load_config(config_path:str) -> Config:
# Check if sections are provided
server_section = "Server" if "Server" in config.sections() else UNNAMED_SECTION
roots_section = "Roots" if "Roots" in config.sections() else UNNAMED_SECTION
backup_section = "Backup"
other_section = "Other" if "Other" in config.sections() else UNNAMED_SECTION
server_config = ServerConfig(
config.get(server_section, "user"),
config.get(server_section, "sshargs", fallback=DEFAULT_SERVER_SSHARGS),
config.get(server_section, "hostname", fallback=DEFAULT_SERVER_HOSTNAME),
config.get(server_section, "ip", fallback=DEFAULT_SERVER_IP),
config.getint(server_section, "port", fallback=DEFAULT_SERVER_PORT)
config.get(server_section, "sshargs", fallback=""),
config.get(server_section, "hostname", fallback=""),
config.get(server_section, "ip", fallback=""),
config.getint(server_section, "port", fallback=22)
)
roots_config = RootsConfig(
config.get(roots_section, "local", fallback=DEFAULT_ROOTS_LOCAL),
config.get(roots_section, "local"),
config.get(roots_section, "remote")
)
backup_config = BackupConfig(
config.getboolean(backup_section, "enabled", fallback=DEFAULT_BACKUP_ENABLED),
config.get(backup_section, "selection", fallback=DEFAULT_BACKUP_SELECTION),
config.get(backup_section, "location", fallback=DEFAULT_BACKUP_LOC),
config.getint(backup_section, "max_backups", fallback=DEFAULT_BACKUP_MAX_BACKUPS),
config.get(backup_section, "backupsuffix", fallback=DEFAULT_BACKUP_BACKUPSUFFIX),
config.get(backup_section, "backupprefix", fallback=DEFAULT_BACKUP_BACKUPPREFIX)
)
other_config = OtherConfig(
Path(config.get(other_section, "cache_dir_path", fallback=DEFAULT_MISC_CACHE_DIR_PATH)).expanduser()
)
args_bool = []
args_val = {}
args_bool = list()
args_val = dict()
if "Unison" in config.sections():
for key, val in config.items("Unison"):
if key in config["DEFAULT"].keys():
continue
if val in ("", None):
elif val == "" or val == None:
args_bool.append(key)
else:
args_val[key] = val
unison_config = UnisonConfig(args_bool, args_val)
return Config(server_config, roots_config, unison_config, backup_config, other_config)
return Config(server_config, roots_config, unison_config)
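
One detail worth noting in the dataclass changes above: mutable defaults such as lists and dicts cannot be assigned directly as dataclass defaults, which is why the fields use field(default_factory=...). A standalone illustration (the class name is made up):

    from dataclasses import dataclass, field

    @dataclass
    class UnisonConfigSketch:
        # bools: list = []  would raise ValueError: mutable default is not allowed
        bools: list = field(default_factory=list)    # a fresh list per instance
        values: dict = field(default_factory=dict)

    a = UnisonConfigSketch()
    b = UnisonConfigSketch()
    a.bools.append("auto")
    print(b.bools)   # prints []: instances do not share the default list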


@@ -1,25 +0,0 @@
# Copyright (C) 2026 Paul Retourné
# SPDX-License-Identifier: GPL-3.0-or-later
from pathlib import Path
# Commented-out values are part of the config but are required, so they have no defaults.
# This allows this file to be a list of all the config options.
# DEFAULT_SERVER_USER: str = ""
DEFAULT_SERVER_SSHARGS: str = ""
DEFAULT_SERVER_HOSTNAME: str = ""
DEFAULT_SERVER_IP: str = ""
DEFAULT_SERVER_PORT: int = 22
DEFAULT_ROOTS_LOCAL: str = str(Path("~/files").expanduser())
# DEFAULT_ROOTS_REMOTE: str = ""
DEFAULT_MISC_CACHE_DIR_PATH: str = "~/.unisync"
DEFAULT_BACKUP_ENABLED: bool = False
DEFAULT_BACKUP_SELECTION: str = ""
DEFAULT_BACKUP_LOC: str = "local"
DEFAULT_BACKUP_MAX_BACKUPS: int = 2
DEFAULT_BACKUP_BACKUPSUFFIX: str = ".$VERSION.bak"
DEFAULT_BACKUP_BACKUPPREFIX: str = ".unison_backups/"


@@ -1,21 +1,5 @@
# Copyright (C) 2025-2026 Paul Retourné
# SPDX-License-Identifier: GPL-3.0-or-later
from typing import NoReturn
import sys
class RemoteMountedError(Exception):
class RemoteMountedError(BaseException):
pass
class InvalidMountError(Exception):
class InvalidMountError(BaseException):
pass
class UnknownSSHError(Exception):
pass
class FatalSyncError(Exception):
pass
def unisync_exit_fatal(reason:str) -> NoReturn:
print(reason)
sys.exit(1)


@@ -1,30 +1,25 @@
# Copyright (C) 2025-2026 Paul Retourné
# Copyright (C) 2025 Paul Retourné
# SPDX-License-Identifier: GPL-3.0-or-later
from pathlib import Path
from unisync.argparser import create_argparser
from unisync.errors import UnknownSSHError, unisync_exit_fatal
from unisync.runners import unisync_sync, unisync_add, unisync_mount
from unisync.config import load_config
from unisync.synchroniser import Synchroniser
from unisync.paths import PathsManager
import os
from argparser import create_argparser
from config import RootsConfig, ServerConfig, Config, load_config
from synchroniser import Synchroniser
from pathlib import Path, PosixPath
from paths import *
def main():
parser = create_argparser(unisync_sync, unisync_add, unisync_mount)
cli_args = parser.parse_args()
parser = create_argparser()
base_namespace = parser.parse_args()
config_path: Path = Path("~/.config/unisync/config.ini").expanduser()
# Check if --config is set
if cli_args.config is not None and Path(cli_args.config).is_file():
config = load_config(cli_args.config)
elif config_path.is_file():
config = load_config(str(config_path))
config_path = os.path.expanduser("~/.config/unisync/config.ini")
if base_namespace.config != None and os.path.isfile(base_namespace.config):
config = load_config(base_namespace.config)
elif os.path.isfile(config_path):
config = load_config(config_path)
else:
# TODO replace the next line with something to do if no config file is found
config = load_config(str(config_path))
# TODO: make the command line arguments work and override the config options
# TODO make the command line arguments work and override the config options
pass
synchroniser = Synchroniser(
config.roots.remote,
@@ -33,16 +28,23 @@ def main():
config.server.ip if config.server.ip != "" else config.server.hostname,
config.server.port,
config.unison.bools,
config.unison.values,
backup=config.backup
config.unison.values
)
paths_manager = PathsManager(Path(config.roots.local), config.other.cache_dir_path)
try:
cli_args.func(synchroniser, paths_manager, config)
except UnknownSSHError:
unisync_exit_fatal("Connection failed quitting")
if synchroniser.create_ssh_master_connection() != 0:
print("Connection failed quitting")
return 1
print("Connected to the remote.")
#synchroniser.sync_files()
#synchroniser.update_links(background=False)
#synchroniser.mount_remote_dir()
synchroniser.close_ssh_master_connection()
print(paths_manager.get_paths_to_sync())
if __name__ == "__main__":


@@ -1,4 +1,4 @@
# Copyright (C) 2025-2026 Paul Retourné
# Copyright (C) 2025 Paul Retourné
# SPDX-License-Identifier: GPL-3.0-or-later
@@ -92,7 +92,7 @@ class PathsManager:
Writes a list of new paths to the file
"""
current_paths = self.get_paths_to_sync()
paths_to_add = []
paths_to_add = list()
# Check if one of the parents is already being synchronised
# If so, there is no need to add the child path
for new_path in paths:
@@ -106,7 +106,7 @@ class PathsManager:
if not is_contained and new_path not in paths_to_add:
paths_to_add.append(new_path)
with self.paths_file.open("a") as f:
with self.paths_file.open("w") as f:
for p in paths_to_add:
f.write(p + "\n")
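
The containment check above skips a new path when one of its parents is already synchronised, so adding folder/file after folder changes nothing. A small pathlib sketch of the same idea (the helper name and path format are assumptions):

    from pathlib import PurePosixPath

    def is_contained(new_path: str, current_paths: list[str]) -> bool:
        """True if new_path equals, or lies under, an already-synchronised path."""
        candidate = PurePosixPath(new_path)
        for existing in map(PurePosixPath, current_paths):
            if candidate == existing or existing in candidate.parents:
                return True
        return False

    print(is_contained("folder/file", ["folder"]))   # True -> no need to add it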


@@ -1,43 +0,0 @@
# Copyright (C) 2025-2026 Paul Retourné
# SPDX-License-Identifier: GPL-3.0-or-later
from unisync.synchroniser import Synchroniser
from unisync.paths import PathsManager
from unisync.config import Config
def unisync_sync(synchroniser:Synchroniser, paths_manager:PathsManager, config: Config):
del config # The function signature must be the same for all runners
synchroniser.create_ssh_master_connection()
print("Connected to the remote.")
synchroniser.sync_files(paths_manager.get_paths_to_sync())
synchroniser.sync_links(paths_manager.get_paths_to_sync())
# TODO check the config options and do or don't do the following
synchroniser.update_links()
#synchroniser.mount_remote_dir()
synchroniser.close_ssh_master_connection()
def unisync_add(synchroniser:Synchroniser, paths_manager:PathsManager, config: Config):
del config # The function signature must be the same for all runners
synchroniser.create_ssh_master_connection()
print("Connected to the remote.")
# TODO config or cli to skip this first sync
synchroniser.sync_files(paths_manager.get_paths_to_sync())
paths_manager.add_files_to_sync()
synchroniser.sync_files(paths_manager.get_paths_to_sync(), force=True)
synchroniser.close_ssh_master_connection()
def unisync_mount(synchroniser:Synchroniser, paths_manager:PathsManager, config: Config):
del paths_manager # The function signature must be the same for all runners
del config # The function signature must be the same for all runners
synchroniser.mount_remote_dir()


@@ -1,12 +1,6 @@
# Copyright (C) 2025-2026 Paul Retourné
# Copyright (C) 2025 Paul Retourné
# SPDX-License-Identifier: GPL-3.0-or-later
"""Exports the Synchroniser class.
This class is used to perform all the actions that require a connection to
the remote.
"""
import subprocess
import os
import sys
@@ -14,47 +8,15 @@ import time
import logging
from pathlib import Path
from typing import cast
from unisync.errors import RemoteMountedError, InvalidMountError, UnknownSSHError, FatalSyncError
from unisync.config import BackupConfig
from errors import RemoteMountedError, InvalidMountError
logger = logging.getLogger(__name__)
class Synchroniser:
"""Synchroniser used to synchronise with a server.
It is used to perform every action needing a connection to the remote.
Create an ssh connection.
Perform the various synchronisation steps (files, links).
Update the links on the remote.
Mount the remote directory.
Close the ssh connection.
Attributes:
remote: The directory to synchronise to on the remote.
local: The directory to synchronise from locally.
user: The user on the remote server.
ip: The ip of the remote server.
port: The ssh port on the remote.
args_bool:
A list of boolean arguments for unison.
They will be passed directly to unison when calling it.
For example: auto will be passed as -auto
args_value:
Same as args_bool but for key value arguments.
Will be passed to unison as "-key value".
ssh_settings:
Settings to pass to the underlying ssh connection.
Currently unused.
"""
def __init__(self, remote:str, local:str, user:str, ip:str, port:int=22,
args_bool:list=[], args_value:dict={}, ssh_settings:dict={},
backup:BackupConfig | None = None
):
"""Initialises an instance of Synchroniser.
"""
def __init__(self, remote:str, local:str, user:str, ip:str,
port:int=22, args_bool:list=[], args_value:dict={}, ssh_settings:dict={}):
self.remote_dir:str = remote
self.local:str = local
self.args_bool:list[str] = args_bool
@@ -63,56 +25,16 @@ class Synchroniser:
self.remote_user:str = user
self.remote_ip:str = ip
self.remote_port:int = port
self.files_extra:list = list()
self.links_extra:list = list()
if(backup != None and backup.enabled):
backup = cast(BackupConfig, backup)
self.files_extra.append("-backup")
if(backup.selection != ""):
self.files_extra.append(backup.selection)
else:
self.files_extra.append("Name *")
self.files_extra.extend([
"-backuploc",
backup.location,
"-maxbackups",
str(backup.max_backups),
"-backupsuffix",
backup.backupsuffix,
"-backupprefix",
backup.backupprefix,
"-ignore",
f"Name {backup.backupprefix[:-1]}"
])
self.links_extra.extend([
"-ignore",
f"Name {backup.backupprefix[:-1]}"
])
def create_ssh_master_connection(self, control_path:str="~/.ssh/control_%C", connection_timeout:int=60) -> None:
"""Creates an ssh master connection.
It is used so the user only has to authenticate once to the remote server.
The subsequent connections will be made through this master connection
which speeds up connecting.
The user only has to enter their password once per synchronisation.
Args:
control_path: Set the location of the ssh control socket
connection_timeout:
Time given to the user to authenticate to the remote server.
On slow connections one might want to increase this.
Raises:
subprocess.TimeoutExpired:
The user didn't finish logging in in time.
KeyboardInterrupt:
The user interrupted the process.
UnknownSSHError:
An error occurred during the connection.
def create_ssh_master_connection(self, control_path:str="~/.ssh/control_%C", connection_timeout:int=60) -> int:
"""
Creates an ssh master connection so the user only has to authenticate once to the remote server.
The subsequent connections will be made through this master connection which speeds up connecting.
@control_path: Set the location of the ssh control socket
@connection_timeout:
Time given to the user to authenticate to the remote server.
On slow connections one might want to increase this.
Returns 0 on success.
"""
self.control_path = os.path.expanduser(control_path)
command = [
@@ -124,25 +46,23 @@ class Synchroniser:
"-p", str(self.remote_port)
]
master_ssh = subprocess.Popen(command)
# TODO: Raise an exception instead of changing the return value
try:
ret_code = master_ssh.wait(timeout=connection_timeout)
except subprocess.TimeoutExpired as e:
except subprocess.TimeoutExpired:
print("Time to login expired", file=sys.stderr)
raise e
except KeyboardInterrupt as e:
raise e
return 1
except KeyboardInterrupt:
return 2
if ret_code != 0:
print("Login to remote failed", file=sys.stderr)
raise UnknownSSHError
return ret_code
return 0
def close_ssh_master_connection(self) -> int:
"""Closes the ssh master connection.
Returns:
The return code of the ssh call.
"""
Close the ssh master connection.
"""
command = [
"/usr/bin/ssh",
@@ -154,68 +74,40 @@ class Synchroniser:
close = subprocess.Popen(command)
return close.wait()
def sync_files(self, paths:list, force:bool=False) -> None:
"""Synchronises the files.
Args:
paths: List of paths to synchronise.
force: Force the changes from remote to local.
Raises:
FatalSyncError: A fatal error occurred during the synchronisation.
def sync_files(self, paths:list, force:bool=False) -> int:
"""
self.sync(
Synchronises the files.
"""
return self.sync(
f"ssh://{self.remote_user}@{self.remote_ip}/{self.remote_dir}/.data",
self.local,
paths=paths,
force=force,
other=self.files_extra
force=force
)
def sync_links(self, ignore:list) -> None:
"""Synchronises the links, they must exist already.
Args:
ignore: List of paths to ignore.
Raises:
FatalSyncError: A fatal error occurred during the synchronisation.
def sync_links(self, ignore:list) -> int:
"""
self.sync(
Synchronises the links; they must exist already.
"""
return self.sync(
f"ssh://{self.remote_user}@{self.remote_ip}/{self.remote_dir}/links",
self.local,
ignore=ignore,
other=self.links_extra
ignore=ignore
)
def sync(self, remote_root:str, local_root:str,
paths:list=[], ignore:list=[], force:bool=False,
other:list=[]
) -> None:
"""Performs the synchronisation by calling unison.
Args:
remote_root: The remote root, must be a full root usable by unison.
local_root: The local root, must be a full root usable by unison.
paths: List of paths to synchronise
ignore: List of paths to ignore
The paths and everything under them will be ignored.
If you need to ignore some specific files use the arguments.
force: Force all changes from remote to local.
Used mostly when replacing a link by the file.
other:
Other arguments to add to unison.
These arguments will only be used for this sync which is not
the case for the ones in self.args_bool and self.args_value.
They will be added to the command as-is, with no leading '-'.
For example, backups are implemented using this argument.
Raises:
FatalSyncError:
If unison returns 3, it means either a fatal error occurred or the synchronisation
was interrupted.
If this happens, the error is propagated to unisync.
paths:list=[], ignore:list=[], force:bool=False) -> int:
"""
Perform the synchronisation by calling unison.
@remote_root: The remote root, must be a full root usable by unison.
@local_root: The local root, must be a full root usable by unison.
@paths: List of paths to synchronise
@ignore: List of paths to ignore
The paths and everything under them will be ignored.
If you need to ignore some specific files use the arguments.
@force: Force all changes from remote to local.
Used mostly when replacing a link by the file.
Returns: the unison return code; see section 6.11 of the unison documentation
"""
command = [ "/usr/bin/unison", "-root", remote_root, "-root", local_root ]
for arg in self.args_bool:
@@ -225,7 +117,6 @@ class Synchroniser:
command.append(value)
sshargs = f"-p {self.remote_port} "
sshargs += f"-S {self.control_path} "
for arg, value in self.ssh_settings.items():
sshargs += arg + " " + value + " "
command.append("-sshargs")
@@ -240,26 +131,21 @@ class Synchroniser:
command.append(f"BelowPath {path}")
if force:
command.append("-prefer")
command.append("-force")
command.append(remote_root)
command.append("-batch")
for arg in other:
command.append(arg)
proc = subprocess.Popen(command)
ret_code = proc.wait()
if ret_code == 3:
raise FatalSyncError("Synchronisation could not be completed")
return ret_code
def update_links(self, background:bool=True):
"""Updates the links on the remote.
"""
Update the links on the remote.
First calls cleanlinks to remove dead links and empty directories.
Then calls lndir to create the new links.
Args:
background: controls if the update is done in the background or waited for.
- background: controls if the update is done in the background or waited for
"""
link_update_script = (f"cd {self.remote_dir}/links && "
@@ -279,7 +165,7 @@ class Synchroniser:
link_background_wrapper
]
link_update_process = subprocess.run(command, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
link_update_process = subprocess.Popen(command, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
if not background:
print("Starting links update.")
@@ -287,14 +173,13 @@ class Synchroniser:
print("Done")
def mount_remote_dir(self):
"""Mounts the remote directory to make the local links work.
This is achieved using sshfs, which may fail.
Raises:
RemoteMountedError: The .data directory is already a mount point.
InvalidMountError: .data is either not a directory or not empty.
subprocess.CalledProcessError: An error occurred with sshfs.
"""
Mount the remote directory to make the local links work.
This is achieved using sshfs.
Raise:
- RemoteMountedError: The .data directory is already a mount point
- InvalidMountError: .data is either not a directory or not empty
- subprocess.CalledProcessError: An error occurred with sshfs
"""
# Get the absolute path to the correct .data directory resolving symlinks
path_to_mount:Path = Path(f"{self.local}/../.data").resolve()
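
To make the sync() parameters above concrete, here is a hypothetical rendering of the kind of unison command the class assembles from the roots, -path, -ignore, and -sshargs pieces (all values below are placeholders, not taken from the code):

    remote_root = "ssh://alice@192.0.2.10/unisync/.data"
    local_root = "/home/alice/files"
    command = ["/usr/bin/unison", "-root", remote_root, "-root", local_root]
    command += ["-auto"]                                    # an args_bool entry becomes -<flag>
    command += ["-sshargs", "-p 22 -S ~/.ssh/control_%C"]   # reuse the ssh master connection
    command += ["-path", "folder"]                          # only synchronise the selected paths
    command += ["-ignore", "BelowPath .unison_backups"]     # skip paths that should not sync
    command += ["-batch"]
    print(" ".join(command))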


@@ -1,8 +0,0 @@
# Copyright (C) 2026 Paul Retourné
# SPDX-License-Identifier: GPL-3.0-or-later
from unisync.synchroniser import Synchroniser
from unisync.paths import PathsManager
def unisync_test(synchroniser:Synchroniser, paths_manager:PathsManager):
print("Testing")


@@ -1,39 +0,0 @@
# Copyright (C) 2026 Paul Retourné
# SPDX-License-Identifier: GPL-3.0-or-later
import os
from pathlib import Path
from unisync.argparser import create_argparser
from unisync.runners import unisync_sync, unisync_add, unisync_mount
from unisync.config import load_config
from unisync.synchroniser import Synchroniser
from unisync.paths import *
from runners import *
def main():
parser = create_argparser(unisync_test, unisync_add, unisync_mount)
cli_args = parser.parse_args()
config_path = os.path.expanduser("./config.ini")
config = load_config(config_path)
print(config)
synchroniser = Synchroniser(
config.roots.remote,
config.roots.local,
config.server.user,
config.server.ip if config.server.ip != "" else config.server.hostname,
config.server.port,
config.unison.bools,
config.unison.values
)
paths_manager = PathsManager(Path(config.roots.local), config.other.cache_dir_path)
cli_args.func(synchroniser, paths_manager)
if __name__ == "__main__":
main()