Compare commits

...

62 Commits

Author SHA1 Message Date
ae0beac9e0 Catch UnknownSSHError and do not use return codes
The runners were still checking for the return codes of
create_ssh_master_connection instead of catching the exception.
We now catch the exceptions when calling the runners in main.
2026-01-31 12:30:47 +01:00
072c2a26e6 errors : Add program ending function
Add a function that quits the program using sys.exit.
This is useful when we encounter a fatal error.
2026-01-31 11:57:09 +01:00
b0c165b8b0 Revert "paths : make TimeoutExpired handling clearer"
This reverts commit 041ede22e1.

There is no point in using "as e" and "raise e"; the original version was
better.
2026-01-31 11:56:51 +01:00
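A minimal illustration of the revert's rationale (generic Python, not the project's paths.py): inside an except block a bare raise re-raises the active exception, so catching it "as e" just to "raise e" adds nothing.

import subprocess

try:
    subprocess.run(["/bin/true"], timeout=5)
except subprocess.TimeoutExpired:
    # equivalent to `except subprocess.TimeoutExpired as e: raise e`
    raise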
6b8686351a defaults : Capitalize the c of Copyright 2026-01-30 19:12:37 +01:00
dcca9c5167 Merge branch 'errors' into dev
Raise Exceptions instead of using return codes
2026-01-30 17:43:01 +01:00
041ede22e1 paths : make TimeoutExpired handling clearer 2026-01-30 17:41:53 +01:00
adfded92d0 synchroniser : raise error instead of returning a value
Raise a FatalSyncError when the synchronisation fails instead of
returning the unison return code.
2026-01-30 17:40:41 +01:00
7fae1b154a errors : Derive Errors from Exception
According to the docs, user-defined exceptions should derive from
Exception, not BaseException.
2026-01-30 17:39:28 +01:00
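A generic illustration of the guideline (the class name is borrowed from this project, the rest is not its code): deriving from Exception keeps the error catchable by ordinary handlers, while BaseException is reserved for interpreter-level exits.

class UnknownSSHError(Exception):
    # reachable by a plain `except Exception:` handler
    pass

# Deriving from BaseException instead would put the error in the same family as
# SystemExit and KeyboardInterrupt, letting it slip past `except Exception:` blocks.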
3dbd7fc445 doc : remove extra whitespace 2026-01-30 17:14:35 +01:00
10a79554d3 synchroniser : use prefer instead of force
When doing a forced synchronisation, use the prefer directive instead of
force. This makes unison choose the remote version only in case of
conflicts and not for every change. This allows the add subcommand to be
used both for adding a file to the sync that is already present remotely
and for adding a brand new file from the local machine (after creating it
or downloading it from somewhere, for example).
2026-01-30 17:06:55 +01:00
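A hypothetical sketch of the flag swap (roots and variable names are placeholders; the real change lives in Synchroniser.sync): -force makes the remote replica win for every path, while -prefer only breaks conflicts in favour of the remote and leaves non-conflicting local changes alone.

remote_root = "ssh://user@server//srv/unisync/.data"  # placeholder root
local_root = "/home/user/files"                       # placeholder root
command = ["/usr/bin/unison", "-root", remote_root, "-root", local_root]
force = True
if force:
    # was: command += ["-force", remote_root]
    command += ["-prefer", remote_root]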
f2b676043c Merge branch 'linter' into dev
Add pylint to the project and fix some of its reports.
2026-01-30 15:31:33 +01:00
24bc6bcc94 Merge branch 'doc' into dev
This adds documentation support using sphinx.
The docs still mostly need to be written.
2026-01-30 15:29:15 +01:00
7dd01260b3 runners : linter reports cleanup
del an unused parameter and remove a whitespace
2026-01-30 15:19:46 +01:00
2dafcc8c6b paths : linter reports cleanup
use [] instead of list()
2026-01-28 18:22:37 +01:00
cbfbb32b86 main : linter reports cleanup
use is not None instead of != None and remove a useless pass
2026-01-28 18:21:38 +01:00
942b6c3cef main : import PathsManager instead of * 2026-01-28 18:21:13 +01:00
a281fab8db defaults : remove trailing whitespace 2026-01-28 16:21:30 +01:00
033de7e7ca config : cleanup linter warnings 2026-01-28 16:20:40 +01:00
405e978796 config : reraise exception instead of raising a new one 2026-01-28 16:19:30 +01:00
68c03c18d5 config : remove unused import 2026-01-28 16:19:06 +01:00
d0cd6353d7 pyproject.toml : Add pylint as dev dependency 2026-01-28 16:14:25 +01:00
9fd70deb9d gitignore : add dist folder
The dist folder is created when running poetry build, so ignore it.
2026-01-23 19:53:00 +01:00
dd042910a9 synchroniser : update_links : use run instead of Popen
Using Popen causes the command to return before the connection is made, so
the closing of the master connection (in runners) happens before we have
had time to log into the server, forcing us to complete an extra login step.
2026-01-23 17:33:41 +01:00
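A simplified sketch of the difference (the command shown is a placeholder, not the exact update_links call): Popen returns as soon as the child process is spawned, while run() blocks until it has finished.

import subprocess

cmd = ["/usr/bin/ssh", "user@server", "true"]  # placeholder command
proc = subprocess.Popen(cmd)       # returns immediately, command may still be running
completed = subprocess.run(cmd)    # waits for the command to complete before returning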
fd825f7e87 synchroniser : use -S to avoid logging in again
We create a master ssh connection; the -S flag uses it to avoid having
to log in again.
2026-01-23 17:29:40 +01:00
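Roughly how the master connection and -S fit together (flags, host and socket path are illustrative, not the project's exact invocation): the master connection is opened once and authenticated interactively, then every later ssh started by unison reuses its control socket instead of prompting again.

control_path = "~/.ssh/control_%C"  # placeholder socket location
master_cmd = ["/usr/bin/ssh", "-M", "-S", control_path, "-N", "user@server"]
unison_sshargs = f"-S {control_path}"  # handed to unison via -sshargs so it reuses the socket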
c7f0a67f17 runners : add synchronisation to unisync_add
unisync_add was missing the synchronisation steps that allow synchronising
a new file (present on the server but only as a link locally).
This adds the two necessary synchronisations: the first one so all the
files are up to date and are not overwritten by the second one, which
forces all the changes from the remote to overwrite the local ones. This
has the effect of replacing the link with the actual file.
2026-01-21 10:33:52 +01:00
7705731dd5 docs : add explanation of what is unisync
also add an example
2026-01-20 23:39:48 +01:00
a922eaa542 synchroniser : use exception instead of return codes
In create_ssh_master_connection return codes were used instead of
proper error handling with exceptions; replace these codes with the
raising of an appropriate exception.
2026-01-20 22:23:09 +01:00
8836a0120b errors : adds UnknownSSHError
This error is used to report that an unknown error happened during an
invocation of ssh.
2026-01-20 22:22:00 +01:00
23a661107e Merge branch 'backup' into dev
Adds the possibility to use the backup function of unison
2026-01-20 10:59:47 +01:00
cf508eb94c main : pass the backup options to the synchroniser 2026-01-20 10:48:44 +01:00
5ec43f9166 synchroniser : move backup options to init
Moves the backup options from sync_files to init.
The options are needed in links (to ignore the backup folders)
so it is way easier to have them as attributes.
To do this we move everything related to backup into __init__.
Also remove the option from the runner.
2026-01-20 10:33:13 +01:00
cf49ffb8e8 synchroniser : fix broken synchronisation
Append was used instead of extend, which made a list inside of a list
instead of appending the content at the end; fix that.
Also convert backup.maxbackups to str as required by subprocess.
2026-01-09 18:31:00 +01:00
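A generic reminder of the append/extend difference behind this fix (variable names are illustrative):

args = ["-backup"]
extra = ["-maxbackups", "2"]
args.append(extra)   # ["-backup", ["-maxbackups", "2"]] -- nested list, breaks the subprocess call
args = ["-backup"]
args.extend(extra)   # ["-backup", "-maxbackups", "2"]   -- what was intended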
c34d30a006 defaults : switch prefix and suffix
I mixed up the prefix and suffix, fix that
2026-01-08 14:19:03 +01:00
bb05990464 runners : pass config.backup to sync_files
After adding the backup infrastructure to config and synchroniser the
only thing left to do is pass the BackupConfig to sync_files.
2026-01-08 14:13:14 +01:00
aaa4a8f12c runners : delete unused arguments
Use the del keyword for unused function arguments in runners.
All the runners must have the same signature; however, some do not use
all of the provided arguments, so we delete them so the development
tools do not generate warnings.
2026-01-08 14:06:36 +01:00
56da79f124 runners, main : pass the config to the runners
Some of the runners need the configuration to perform their task.
So pass it to all of them and edit the call in main to reflect this
change.
2026-01-08 14:04:05 +01:00
0e8d568fea main : Use pathlib instead of os.path
Removes every use of os.path and replaces it with the equivalent pathlib
method.
This allows us to avoid importing os.
2026-01-08 13:46:01 +01:00
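An illustrative pair of equivalents (the config path is a placeholder): pathlib covers the expanduser/isfile combinations used in main without importing os.

import os.path
from pathlib import Path

old_way = os.path.isfile(os.path.expanduser("~/.config/unisync/config.ini"))
new_way = Path("~/.config/unisync/config.ini").expanduser().is_file()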
2ae9c38627 tests : add some simple code to run a few tests 2026-01-07 23:35:26 +01:00
667c418f09 synchroniser : add backup to sync_files
Adds the option to enable backup when synchronising.
This is done in sync_files by passing the appropriate arguments to sync.
For this we need to add an argument to sync_files as the backup
configuration options are needed.
The configuration options are imported from unisync.config.BackupConfig.
Also import typing.cast to be able to narrow down a type.
2026-01-07 23:32:24 +01:00
f618932584 synchroniser : add arbitrary synchronisation arguments
Add the option to give arbitrary arguments to the unison call.
These arguments must be passed as a list to sync and will be given to
unison as is.
This is a prerequisite for using the backup system of unison as the
arguments for backup will only be given when synchronising the files and
not the links.
2026-01-07 23:27:48 +01:00
f5e455fc79 config, defaults: add configuration for backups
Add configuration options for creating backups during the
synchronisation.
2026-01-05 17:17:41 +01:00
78a4d9df36 gitignore : ignore docs/build
The docs will be added later, but to prevent a mess when switching
between branches, ignore the build folder.
2026-01-04 19:22:04 +01:00
e639c12c20 docs : Add sphinx for handling documentation
Edit gitignore by ignoring the docs/build directory
Add sphinx dependencies to pyproject
Add docs folder
2026-01-04 19:18:57 +01:00
c10077392e Change TODOs format.
Use TODO: instead of TODO
2026-01-04 19:18:18 +01:00
7dd7b57e1f synchroniser : Use a consistent docstring format.
Edit the docstrings so they use a consistent format.
Also add a short module docstring.
2026-01-04 14:31:16 +01:00
b10ed69d59 defaults : change type of MISC_CACHE_DIR_PATH to str
DEFAULT_MISC_CACHE_DIR_PATH was a Path, but the fallbacks of config.get
in config.py will be converted to a string, so make it a string instead
and do the conversion later.
2026-01-04 12:22:21 +01:00
ec8030fc81 config : fix cache_dir_path value parsing error
Configparser's config.get returns a string and we want a Path. For the
moment, convert it to a Path directly.
2026-01-03 18:05:43 +01:00
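A minimal sketch of the conversion (section and option names mirror the project's config, but the snippet is illustrative): ConfigParser.get always returns a string, so the value is wrapped in Path and expanded where it is consumed.

import configparser
from pathlib import Path

cfg = configparser.ConfigParser()
cfg.read_string("[Other]\ncache_dir_path = ~/.unisync\n")
cache_dir = Path(cfg.get("Other", "cache_dir_path", fallback="~/.unisync")).expanduser()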
f050dcc94f runners : fix sync runner not synchronising links
The wrong function was called in the sync runner (update_links instead of
sync_links), which meant the links were updated remotely but never
synchronised locally.
Call sync_links instead.
We keep the call to update_links but set it to run in the background.
2026-01-03 18:02:20 +01:00
f40a5c9276 Merge branch 'abstract_defaults'
Abstract the defaults into a separate file
2026-01-03 17:20:18 +01:00
0e80ba0b0d config : use the defaults from defaults.py
Remove the defaults from the dataclasses as they are redundant with the
fallbacks of configparser.
Use the values in defaults.py as the fallbacks instead of hardcoded
values.
2026-01-03 17:18:19 +01:00
a223f04909 config : take cache_dir_path into account
cache_dir_path and all of OtherConfig were ignored and the default
value was loaded; read its value from the config file instead.
2026-01-03 17:15:22 +01:00
e42ae71862 defaults : Create defaults.py
Creates the file defaults.py; it is used to store the defaults and
easily include them in the config.
Changing defaults is thus possible without touching the code, leaving
less room for errors.
2026-01-03 17:10:06 +01:00
58c7f7d1be Add main as a script with poetry 2026-01-03 16:39:00 +01:00
eefb21faff Mark local imports as such.
Prefix local imports with "unisync." so they are not mistaken for
external module imports.
2026-01-03 16:24:58 +01:00
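A small illustration of the change (import lines only, not the project's full modules): the bare form could be shadowed by an unrelated third-party package named config, while the prefixed form is unambiguous.

from config import load_config          # before: looks like an external module
from unisync.config import load_config  # after: clearly part of the unisync package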
941c467fc2 Bump copyright year and add missing file headers 2026-01-02 10:58:27 +01:00
4dcab777ca Merge branch 'dev'
Got to a state that seems stable enough to go into main
2026-01-02 10:45:00 +01:00
a169890351 main : adds subcommands, move to Path and improve
Multiple changes to the main file; after this, unisync becomes somewhat
usable.
Add subcommands : this uses the two previous commits to add the
subcommands to unisync; it is now possible to sync, add and mount.
pathlib : move from PosixPath to Path
Remove unused imports
Rename base_namespace to cli_args
Add some comments and TODOs
2026-01-02 10:44:59 +01:00
b70070ba1a argparser : adds subcommands to the argparser
This adds subcommands to the argparser using subparsers; we also set a
default value for func depending on which subcommand is
selected.
Also change the formatting of the epilog so it is on two lines.
2026-01-02 10:44:58 +01:00
bd72d740e6 runners : Create runners file and basic runners
This adds runners.py; it contains a set of functions that perform all the
various tasks that unisync can do (sync, add and mount for now).
They are simple functions that tie together all the rest.
2026-01-02 10:44:57 +01:00
e43c16adb3 paths : fix how write_new_paths writes the file
I was writing the file using 'w' instead of 'a', so the old paths were
deleted; use 'a' instead.
2026-01-02 10:44:56 +01:00
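A generic reminder of the open-mode difference behind this fix (file name is a placeholder):

with open("paths.txt", "w") as f:  # 'w' truncates: previously stored paths are lost
    f.write("new/path\n")
with open("paths.txt", "a") as f:  # 'a' appends: existing paths are kept
    f.write("new/path\n")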
10200fceb9 config : fallback to port 22 instead of None
The configparser fallback option for port was None; set it to 22
instead, as None doesn't make sense.
2026-01-02 10:44:55 +01:00
138bc6d24a Update README to reflect the state of the project 2025-12-31 00:04:53 +01:00
20 changed files with 647 additions and 118 deletions

2
.gitignore vendored

@@ -1,2 +1,4 @@
poetry.lock
__pycache__
docs/build
dist/

View File

@@ -23,5 +23,4 @@ The issue is that you need to know what data is stored on the server to avoid co
# Development
Unisync was at first a simple bash script, but as it grew more complex I started struggling to maintain it, which is why I am porting it to python. It will make everything more robust, easier to maintain, and easier to add functionality.
I am in the early stages of the development process; this should be usable in the upcoming weeks.
Help will be welcome in the future but is not desirable right now, as I want to shape this the way I want to.
I am in the early stages of the development process; this should be usable someday (hopefully).

20
docs/Makefile Normal file

@@ -0,0 +1,20 @@
# Minimal makefile for Sphinx documentation
#
# You can set these variables from the command line, and also
# from the environment for the first two.
SPHINXOPTS ?=
SPHINXBUILD ?= sphinx-build
SOURCEDIR = source
BUILDDIR = build
# Put it first so that "make" without argument is like "make help".
help:
@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
.PHONY: help Makefile
# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

35
docs/make.bat Normal file

@@ -0,0 +1,35 @@
@ECHO OFF
pushd %~dp0
REM Command file for Sphinx documentation
if "%SPHINXBUILD%" == "" (
set SPHINXBUILD=sphinx-build
)
set SOURCEDIR=source
set BUILDDIR=build
%SPHINXBUILD% >NUL 2>NUL
if errorlevel 9009 (
echo.
echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
echo.installed, then set the SPHINXBUILD environment variable to point
echo.to the full path of the 'sphinx-build' executable. Alternatively you
echo.may add the Sphinx directory to PATH.
echo.
echo.If you don't have Sphinx installed, grab it from
echo.https://www.sphinx-doc.org/
exit /b 1
)
if "%1" == "" goto help
%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
goto end
:help
%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
:end
popd

36
docs/source/conf.py Normal file

@@ -0,0 +1,36 @@
# Configuration file for the Sphinx documentation builder.
#
# For the full list of built-in configuration values, see the documentation:
# https://www.sphinx-doc.org/en/master/usage/configuration.html
# -- Project information -----------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#project-information
project = 'unisync'
copyright = '2026, Paul Retourné'
author = 'Paul Retourné'
release = '0.1.0'
# -- General configuration ---------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#general-configuration
extensions = [
'sphinx.ext.autodoc',
'sphinx.ext.viewcode',
'sphinx.ext.napoleon',
'sphinx.ext.todo'
]
templates_path = ['_templates']
exclude_patterns = []
# -- Options for HTML output -------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#options-for-html-output
#html_theme = 'alabaster'
html_theme = 'sphinx_rtd_theme'
html_static_path = ['_static']
autodoc_docstring_signature = True

24
docs/source/example.rst Normal file

@@ -0,0 +1,24 @@
.. _example_how_it_works:
Example of how unisync works
============================
Let's say you have the following structure::
$ tree .
.
├── big_file
└── folder
   ├── file
   └── other_file
If you only want to synchronise `folder` and its content on your laptop the following will be automatically generated::
$ tree .
.
├── big_file -> ../.data/big_file
└── folder
   ├── file
   └── other_file
`big_file` is now a symbolic link and by mounting the remote directory you can still seamlessly access `big_file` through the network.

31
docs/source/index.rst Normal file

@@ -0,0 +1,31 @@
.. unisync documentation master file, created by
sphinx-quickstart on Sun Jan 4 15:02:58 2026.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Documentation for unisync
=========================
Unisync is a data synchronising tool built around `unison`_ and expanding on it.
Unisync tries to solve two problems that are often solved separately but never together:
* Keeping your data synchronised between multiple machines (through a central server), examples of this are rsync and of course unison.
* Being able to access and edit files stored on your server without having to download them, the gui interface of nextcloud for example.
* And of course I want to be able to do all of this without ever having to leave my terminal.
Unisync solves this problem by placing every file on your local machine, but only the selected files and folders are physically present on your drive;
the others are replaced by symbolic links pointing to a directory that is mounted from your server.
See this
:ref:`example_how_it_works`.
.. _unison: https://github.com/bcpierce00/unison
.. toctree::
:maxdepth: 2
:caption: Contents:
example
modules

7
docs/source/modules.rst Normal file

@@ -0,0 +1,7 @@
unisync
=======
.. toctree::
:maxdepth: 4
unisync

77
docs/source/unisync.rst Normal file

@@ -0,0 +1,77 @@
unisync package
===============
Submodules
----------
unisync.argparser module
------------------------
.. automodule:: unisync.argparser
:members:
:show-inheritance:
:undoc-members:
unisync.config module
---------------------
.. automodule:: unisync.config
:members:
:show-inheritance:
:undoc-members:
unisync.defaults module
-----------------------
.. automodule:: unisync.defaults
:members:
:show-inheritance:
:undoc-members:
unisync.errors module
---------------------
.. automodule:: unisync.errors
:members:
:show-inheritance:
:undoc-members:
unisync.main module
-------------------
.. automodule:: unisync.main
:members:
:show-inheritance:
:undoc-members:
unisync.paths module
--------------------
.. automodule:: unisync.paths
:members:
:show-inheritance:
:undoc-members:
unisync.runners module
----------------------
.. automodule:: unisync.runners
:members:
:show-inheritance:
:undoc-members:
unisync.synchroniser module
---------------------------
.. automodule:: unisync.synchroniser
:members:
:show-inheritance:
:undoc-members:
Module contents
---------------
.. automodule:: unisync
:members:
:show-inheritance:
:undoc-members:

View File

@@ -10,6 +10,9 @@ requires-python = ">=3.13"
dependencies = [
]
[project.scripts]
unisync = "unisync.main:main"
[tool.poetry]
packages = [{include = "unisync", from = "src"}]
@@ -17,3 +20,12 @@ packages = [{include = "unisync", from = "src"}]
[build-system]
requires = ["poetry-core>=2.0.0,<3.0.0"]
build-backend = "poetry.core.masonry.api"
[dependency-groups]
docs = [
"sphinx (>=9.1.0,<10.0.0)",
"sphinx-rtd-theme (>=3.0.2,<4.0.0)",
]
dev = [
"pylint (>=4.0.4,<5.0.0)"
]

View File

@@ -1,22 +1,36 @@
# Copyright (C) 2025 Paul Retourné
# Copyright (C) 2025-2026 Paul Retourné
# SPDX-License-Identifier: GPL-3.0-or-later
import argparse
def create_argparser() -> argparse.ArgumentParser:
def create_argparser(sync_function, add_function, mount_function) -> argparse.ArgumentParser:
"""
Creates an argument parser to parse the command line arguments.
We use subparsers and set a default function for each to perform the correct action.
"""
parser = argparse.ArgumentParser(
prog='unisync',
description='File synchronisation application',
epilog="""
Copyright © 2025 Paul Retourné.
License GPLv3+: GNU GPL version 3 or later <https://gnu.org/licenses/gpl.html>."""
epilog="Copyright © 2025-2026 Paul Retourné.\n"
"License GPLv3+: GNU GPL version 3 or later <https://gnu.org/licenses/gpl.html>.",
formatter_class=argparse.RawDescriptionHelpFormatter
)
parser.add_argument("local", nargs="?")
parser.add_argument("remote", nargs="?")
parser.set_defaults(func=sync_function)
remote_addr_group = parser.add_mutually_exclusive_group()
remote_addr_group.add_argument("--ip")
remote_addr_group.add_argument("--hostname")
parser.add_argument("--config", help="Path to the configuration file", metavar="path_to_config")
subparsers = parser.add_subparsers(help='Actions other than synchronisation')
parser_add = subparsers.add_parser('add', help='Add files to be synchronised.')
parser_add.set_defaults(func=add_function)
parser_mount = subparsers.add_parser('mount', help='Mount the remote.')
parser_mount.set_defaults(func=mount_function)
return parser

View File

@@ -1,22 +1,24 @@
# Copyright (C) 2025 Paul Retourné
# Copyright (C) 2025-2026 Paul Retourné
# SPDX-License-Identifier: GPL-3.0-or-later
from configparser import UNNAMED_SECTION
from dataclasses import dataclass, field
from dataclasses import dataclass
from pathlib import Path
import ipaddress
import configparser
from unisync.defaults import *
@dataclass
class ServerConfig:
"""
Dataclass keeping the config for connecting to the server
"""
user: str
sshargs: str = ""
hostname: str = ""
ip: str = ""
port: int | None = 22
sshargs: str
hostname: str
ip: str
port: int
def __post_init__(self):
"""
@@ -28,8 +30,8 @@ class ServerConfig:
if self.ip != "":
try:
ipaddress.ip_address(self.ip)
except ValueError:
raise ValueError("The provided ip address is invalid")
except ValueError as e:
raise ValueError("The provided ip address is invalid") from e
@dataclass
class RootsConfig:
@@ -44,15 +46,27 @@ class UnisonConfig:
"""
Dataclass keeping unison specific configurations
"""
bools: list = field(default_factory=list)
values: dict = field(default_factory=dict)
bools: list
values: dict
@dataclass
class BackupConfig:
"""
Configuration options relative to backing up the files.
"""
enabled: bool
selection: str
location: str
max_backups: int
backupsuffix: str
backupprefix: str
@dataclass
class OtherConfig:
"""
Dataclass keeping miscellaneous configuration options
"""
cache_dir_path: Path = Path("~/.unisync").expanduser()
cache_dir_path: Path
@dataclass
class Config:
@@ -62,7 +76,8 @@ class Config:
server: ServerConfig
roots: RootsConfig
unison: UnisonConfig
other: OtherConfig = field(default_factory=OtherConfig)
backup: BackupConfig
other: OtherConfig
def load_config(config_path:str) -> Config:
@@ -79,29 +94,42 @@ def load_config(config_path:str) -> Config:
# Check if sections are provided
server_section = "Server" if "Server" in config.sections() else UNNAMED_SECTION
roots_section = "Roots" if "Roots" in config.sections() else UNNAMED_SECTION
backup_section = "Backup"
other_section = "Other" if "Other" in config.sections() else UNNAMED_SECTION
server_config = ServerConfig(
config.get(server_section, "user"),
config.get(server_section, "sshargs", fallback=""),
config.get(server_section, "hostname", fallback=""),
config.get(server_section, "ip", fallback=""),
config.getint(server_section, "port", fallback=None)
config.get(server_section, "sshargs", fallback=DEFAULT_SERVER_SSHARGS),
config.get(server_section, "hostname", fallback=DEFAULT_SERVER_HOSTNAME),
config.get(server_section, "ip", fallback=DEFAULT_SERVER_IP),
config.getint(server_section, "port", fallback=DEFAULT_SERVER_PORT)
)
roots_config = RootsConfig(
config.get(roots_section, "local"),
config.get(roots_section, "local", fallback=DEFAULT_ROOTS_LOCAL),
config.get(roots_section, "remote")
)
backup_config = BackupConfig(
config.getboolean(backup_section, "enabled", fallback=DEFAULT_BACKUP_ENABLED),
config.get(backup_section, "selection", fallback=DEFAULT_BACKUP_SELECTION),
config.get(backup_section, "loction", fallback=DEFAULT_BACKUP_LOC),
config.getint(backup_section, "max_backups", fallback=DEFAULT_BACKUP_MAX_BACKUPS),
config.get(backup_section, "backupsuffix", fallback=DEFAULT_BACKUP_BACKUPSUFFIX),
config.get(backup_section, "backupprefix", fallback=DEFAULT_BACKUP_BACKUPPREFIX)
)
other_config = OtherConfig(
Path(config.get(other_section, "cache_dir_path", fallback=DEFAULT_MISC_CACHE_DIR_PATH)).expanduser()
)
args_bool = list()
args_val = dict()
args_bool = []
args_val = {}
if "Unison" in config.sections():
for key, val in config.items("Unison"):
if key in config["DEFAULT"].keys():
continue
elif val == "" or val == None:
if val in ("", None):
args_bool.append(key)
else:
args_val[key] = val
unison_config = UnisonConfig(args_bool, args_val)
return Config(server_config, roots_config, unison_config)
return Config(server_config, roots_config, unison_config, backup_config, other_config)

25
src/unisync/defaults.py Normal file

@@ -0,0 +1,25 @@
# Copyright (C) 2026 Paul Retourné
# SPDX-License-Identifier: GPL-3.0-or-later
from pathlib import Path
# Commented out values are part of the config but are required, so there are no defaults.
# This allows this file to be a list of all the config options.
# DEFAULT_SERVER_USER: str = ""
DEFAULT_SERVER_SSHARGS: str = ""
DEFAULT_SERVER_HOSTNAME: str = ""
DEFAULT_SERVER_IP: str = ""
DEFAULT_SERVER_PORT: int = 22
DEFAULT_ROOTS_LOCAL: str = str(Path("~/files").expanduser())
# DEFAULT_ROOTS_REMOTE: str = ""
DEFAULT_MISC_CACHE_DIR_PATH: str = "~/.unisync"
DEFAULT_BACKUP_ENABLED: bool = False
DEFAULT_BACKUP_SELECTION: str = ""
DEFAULT_BACKUP_LOC: str = "local"
DEFAULT_BACKUP_MAX_BACKUPS: int = 2
DEFAULT_BACKUP_BACKUPSUFFIX: str = ".$VERSION.bak"
DEFAULT_BACKUP_BACKUPPREFIX: str = ".unison_backups/"

View File

@@ -1,5 +1,21 @@
class RemoteMountedError(BaseException):
# Copyright (C) 2025-2026 Paul Retourné
# SPDX-License-Identifier: GPL-3.0-or-later
from typing import NoReturn
import sys
class RemoteMountedError(Exception):
pass
class InvalidMountError(BaseException):
class InvalidMountError(Exception):
pass
class UnknownSSHError(Exception):
pass
class FatalSyncError(Exception):
pass
def unisync_exit_fatal(reason:str) -> NoReturn:
print(reason)
sys.exit(1)

View File

@@ -1,25 +1,30 @@
# Copyright (C) 2025 Paul Retourné
# Copyright (C) 2025-2026 Paul Retourné
# SPDX-License-Identifier: GPL-3.0-or-later
import os
from argparser import create_argparser
from config import RootsConfig, ServerConfig, Config, load_config
from synchroniser import Synchroniser
from pathlib import Path, PosixPath
from paths import *
from pathlib import Path
from unisync.argparser import create_argparser
from unisync.errors import UnknownSSHError, unisync_exit_fatal
from unisync.runners import unisync_sync, unisync_add, unisync_mount
from unisync.config import load_config
from unisync.synchroniser import Synchroniser
from unisync.paths import PathsManager
def main():
parser = create_argparser()
base_namespace = parser.parse_args()
parser = create_argparser(unisync_sync, unisync_add, unisync_mount)
cli_args = parser.parse_args()
config_path = os.path.expanduser("~/.config/unisync/config.ini")
if base_namespace.config != None and os.path.isfile(base_namespace.config):
config = load_config(base_namespace.config)
elif os.path.isfile(config_path):
config = load_config(config_path)
config_path: Path = Path("~/.config/unisync/config.ini").expanduser()
# Check if --config is set
if cli_args.config is not None and Path(cli_args.config).is_file():
config = load_config(cli_args.config)
elif config_path.is_file():
config = load_config(str(config_path))
else:
# TODO make the command line arguments work and override the config options
pass
# TODO replace the next line with something to do if no config file is found
config = load_config(str(config_path))
# TODO: make the command line arguments work and override the config options
synchroniser = Synchroniser(
config.roots.remote,
@@ -28,23 +33,16 @@ def main():
config.server.ip if config.server.ip != "" else config.server.hostname,
config.server.port,
config.unison.bools,
config.unison.values
config.unison.values,
backup=config.backup
)
paths_manager = PathsManager(Path(config.roots.local), config.other.cache_dir_path)
if synchroniser.create_ssh_master_connection() != 0:
print("Connection failed quitting")
return 1
print("Connected to the remote.")
#synchroniser.sync_files()
#synchroniser.update_links(background=False)
#synchroniser.mount_remote_dir()
synchroniser.close_ssh_master_connection()
print(paths_manager.get_paths_to_sync())
try:
cli_args.func(synchroniser, paths_manager, config)
except UnknownSSHError:
unisync_exit_fatal("Connection failed quitting")
if __name__ == "__main__":

View File

@@ -1,4 +1,4 @@
# Copyright (C) 2025 Paul Retourné
# Copyright (C) 2025-2026 Paul Retourné
# SPDX-License-Identifier: GPL-3.0-or-later
@@ -92,7 +92,7 @@ class PathsManager:
Writes a list of new paths to the file
"""
current_paths = self.get_paths_to_sync()
paths_to_add = list()
paths_to_add = []
# Check if one of the parent is already being synchronised
# If so there is no need to add the child path
for new_path in paths:
@@ -106,7 +106,7 @@ class PathsManager:
if not is_contained and new_path not in paths_to_add:
paths_to_add.append(new_path)
with self.paths_file.open("w") as f:
with self.paths_file.open("a") as f:
for p in paths_to_add:
f.write(p + "\n")

43
src/unisync/runners.py Normal file

@@ -0,0 +1,43 @@
# Copyright (C) 2025-2026 Paul Retourné
# SPDX-License-Identifier: GPL-3.0-or-later
from unisync.synchroniser import Synchroniser
from unisync.paths import PathsManager
from unisync.config import Config
def unisync_sync(synchroniser:Synchroniser, paths_manager:PathsManager, config: Config):
del config # The function signature must be the same for all runners
synchroniser.create_ssh_master_connection()
print("Connected to the remote.")
synchroniser.sync_files(paths_manager.get_paths_to_sync())
synchroniser.sync_links(paths_manager.get_paths_to_sync())
# TODO check the config options and do or don't do the following
synchroniser.update_links()
#synchroniser.mount_remote_dir()
synchroniser.close_ssh_master_connection()
def unisync_add(synchroniser:Synchroniser, paths_manager:PathsManager, config: Config):
del config # The function signature must be the same for all runners
synchroniser.create_ssh_master_connection()
print("Connected to the remote.")
# TODO config or cli to skip this first sync
synchroniser.sync_files(paths_manager.get_paths_to_sync())
paths_manager.add_files_to_sync()
synchroniser.sync_files(paths_manager.get_paths_to_sync(), force=True)
synchroniser.close_ssh_master_connection()
def unisync_mount(synchroniser:Synchroniser, paths_manager:PathsManager, config: Config):
del paths_manager # The function signature must be the same for all runners
del config # The function signature must be the same for all runners
synchroniser.mount_remote_dir()

View File

@@ -1,6 +1,12 @@
# Copyright (C) 2025 Paul Retourné
# Copyright (C) 2025-2026 Paul Retourné
# SPDX-License-Identifier: GPL-3.0-or-later
"""Exports the Synchroniser class.
This class is used to perform all the actions that require a connection to
the remote.
"""
import subprocess
import os
import sys
@@ -8,15 +14,47 @@ import time
import logging
from pathlib import Path
from typing import cast
from errors import RemoteMountedError, InvalidMountError
from unisync.errors import RemoteMountedError, InvalidMountError, UnknownSSHError, FatalSyncError
from unisync.config import BackupConfig
logger = logging.getLogger(__name__)
class Synchroniser:
"""Synchroniser used to synchronise with a server.
def __init__(self, remote:str, local:str, user:str, ip:str,
port:int=22, args_bool:list=[], args_value:dict={}, ssh_settings:dict={}):
It is used to perform every action needing a connection to the remote.
Create an ssh connection.
Perform the various synchronisation steps (files, links).
Update the links on the remote.
Mount the remote directory.
Close the ssh connection.
Attributes:
remote: The directory to synchronise to on the remote.
local: The directory to synchronise from locally.
user: The user on the remote server.
ip: The ip of the remote server.
port: The ssh port on the remote.
args_bool:
A list of boolean arguments for unison.
They will be passed directly to unison when calling it.
For example: auto will be passed as -auto
args_value:
Same as args_bool but for key value arguments.
Will be passed to unison as "-key value".
ssh_settings:
Settings to pass to the underlying ssh connection.
Currently unused.
"""
def __init__(self, remote:str, local:str, user:str, ip:str, port:int=22,
args_bool:list=[], args_value:dict={}, ssh_settings:dict={},
backup:BackupConfig | None = None
):
"""Initialises an instance of Synchroniser.
"""
self.remote_dir:str = remote
self.local:str = local
self.args_bool:list[str] = args_bool
@@ -25,16 +63,56 @@ class Synchroniser:
self.remote_user:str = user
self.remote_ip:str = ip
self.remote_port:int = port
self.files_extra:list = list()
self.links_extra:list = list()
def create_ssh_master_connection(self, control_path:str="~/.ssh/control_%C", connection_timeout:int=60) -> int:
"""
Creates an ssh master connection so the user only has to authenticate once to the remote server.
The subsequent connections will be made through this master connection which speeds up connecting.
@control_path: Set the location of the ssh control socket
@connection_timeout:
Time given to the user to authenticate to the remote server.
On slow connections one might want to increase this.
Returns 0 on success.
if(backup != None and backup.enabled):
backup = cast(BackupConfig, backup)
self.files_extra.append("-backup")
if(backup.selection != ""):
self.files_extra.append(backup.selection)
else:
self.files_extra.append("Name *")
self.files_extra.extend([
"-backuploc",
backup.location,
"-maxbackups",
str(backup.max_backups),
"-backupsuffix",
backup.backupsuffix,
"-backupprefix",
backup.backupprefix,
"-ignore",
f"Name {backup.backupprefix[:-1]}"
])
self.links_extra.extend([
"-ignore",
f"Name {backup.backupprefix[:-1]}"
])
def create_ssh_master_connection(self, control_path:str="~/.ssh/control_%C", connection_timeout:int=60) -> None:
"""Creates an ssh master connection.
It is used so the user only has to authenticate once to the remote server.
The subsequent connections will be made through this master connection
which speeds up connecting.
The users only have to enter their password once per synchronisation.
Args:
control_path: Set the location of the ssh control socket
connection_timeout:
Time given to the user to authenticate to the remote server.
On slow connections one might want to increase this.
Raises:
subprocess.TimeoutExpired:
The user didn't finish logging in in time.
KeyboardInterrupt:
The user interrupted the process.
UnknownSSHError:
An error occurred during the connection.
"""
self.control_path = os.path.expanduser(control_path)
command = [
@@ -46,23 +124,25 @@ class Synchroniser:
"-p", str(self.remote_port)
]
master_ssh = subprocess.Popen(command)
# TODO: Raise an exception instead of changing the return value
try:
ret_code = master_ssh.wait(timeout=connection_timeout)
except subprocess.TimeoutExpired:
except subprocess.TimeoutExpired as e:
print("Time to login expired", file=sys.stderr)
return 1
except KeyboardInterrupt:
return 2
raise e
except KeyboardInterrupt as e:
raise e
if ret_code != 0:
print("Login to remote failed", file=sys.stderr)
return ret_code
return 0
raise UnknownSSHError
def close_ssh_master_connection(self) -> int:
"""
Close the ssh master connection.
"""Closes the ssh master connection.
Returns:
The return code of the ssh call.
"""
command = [
"/usr/bin/ssh",
@@ -74,40 +154,68 @@ class Synchroniser:
close = subprocess.Popen(command)
return close.wait()
def sync_files(self, paths:list, force:bool=False) -> int:
def sync_files(self, paths:list, force:bool=False) -> None:
"""Synchronises the files.
Args:
paths: List of paths to synchronise.
force: Force the changes from remote to local.
Raises:
FatalSyncError: A fatal error occurred during the synchronisation.
"""
Synchronises the files.
"""
return self.sync(
self.sync(
f"ssh://{self.remote_user}@{self.remote_ip}/{self.remote_dir}/.data",
self.local,
paths=paths,
force=force
force=force,
other=self.files_extra
)
def sync_links(self, ignore:list) -> int:
def sync_links(self, ignore:list) -> None:
"""Synchronises the links, they must exist already.
Args:
ignore: List of paths to ignore.
Raises:
FatalSyncError: A fatal error occurred during the synchronisation.
"""
Synchronises the links, they must exist already.
"""
return self.sync(
self.sync(
f"ssh://{self.remote_user}@{self.remote_ip}/{self.remote_dir}/links",
self.local,
ignore=ignore
ignore=ignore,
other=self.links_extra
)
def sync(self, remote_root:str, local_root:str,
paths:list=[], ignore:list=[], force:bool=False) -> int:
"""
Perform the synchronisation by calling unison.
@remote_root: The remote root, must be a full root usable by unison.
@local_root: The local root, must be a full root usable by unison.
@paths: List of paths to synchronise
@ignore: List of paths to ignore
The paths and everything under them will be ignored.
If you need to ignore some specific files use the arguments.
@force: Force all changes from remote to local.
Used mostly when replacing a link by the file.
Returns: the unison return code see section 6.11 of the documentation
paths:list=[], ignore:list=[], force:bool=False,
other:list=[]
) -> None:
"""Performs the synchronisation by calling unison.
Args:
remote_root: The remote root, must be a full root usable by unison.
local_root: The local root, must be a full root usable by unison.
paths: List of paths to synchronise
ignore: List of paths to ignore
The paths and everything under them will be ignored.
If you need to ignore some specific files use the arguments.
force: Force all changes from remote to local.
Used mostly when replacing a link by the file.
other:
Other arguments to add to unison.
These arguments will only be used for this sync which is not
the case for the ones in self.args_bool and self.args_value.
They will be added to the command as is, with no - added in front.
For example, backups are implemented using this argument.
Raises:
FatalSyncError:
If unison returns 3, it means either a fatal error occurred or the synchronisation
was interrupted.
If this happens, propagate the error to unisync.
"""
command = [ "/usr/bin/unison", "-root", remote_root, "-root", local_root ]
for arg in self.args_bool:
@@ -117,6 +225,7 @@ class Synchroniser:
command.append(value)
sshargs = f"-p {self.remote_port} "
sshargs += f"-S {self.control_path} "
for arg, value in self.ssh_settings.items():
sshargs += arg + " " + value + " "
command.append("-sshargs")
@@ -131,21 +240,26 @@ class Synchroniser:
command.append(f"BelowPath {path}")
if force:
command.append("-force")
command.append("-prefer")
command.append(remote_root)
command.append("-batch")
for arg in other:
command.append(arg)
proc = subprocess.Popen(command)
ret_code = proc.wait()
return ret_code
if ret_code == 3:
raise FatalSyncError("Synchronisation could not be completed")
def update_links(self, background:bool=True):
"""
Update the links on the remote.
"""Updates the links on the remote.
First calls cleanlinks to remove dead links and empty directories.
Then calls lndir to create the new links.
Args:
- background: controls if the update is done in the background or waited for
background: controls if the update is done in the background or waited for.
"""
link_update_script = (f"cd {self.remote_dir}/links && "
@@ -165,7 +279,7 @@ class Synchroniser:
link_background_wrapper
]
link_update_process = subprocess.Popen(command, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
link_update_process = subprocess.run(command, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
if not background:
print("Starting links update.")
@@ -173,13 +287,14 @@ class Synchroniser:
print("Done")
def mount_remote_dir(self):
"""
Mount the remote directory to make the local links work.
This is achieved using sshfs.
Raise:
- RemoteMountedError: The .data directory is already a mount point
- InvalidMountError: .data is either not a directory or not empty
- subprocess.CalledProcessError: An error occured with sshfs
"""Mounts the remote directory to make the local links work.
This is achieved using sshfs which may fail.
Raises:
RemoteMountedError: The .data directory is already a mount point.
InvalidMountError: .data is either not a directory or not empty.
subprocess.CalledProcessError: An error occurred with sshfs.
"""
# Get the absolute path to the correct .data directory resolving symlinks
path_to_mount:Path = Path(f"{self.local}/../.data").resolve()

8
tests/runners.py Normal file

@@ -0,0 +1,8 @@
# Copyright (C) 2026 Paul Retourné
# SPDX-License-Identifier: GPL-3.0-or-later
from unisync.synchroniser import Synchroniser
from unisync.paths import PathsManager
def unisync_test(synchroniser:Synchroniser, paths_manager:PathsManager):
print("Testing")

39
tests/test.py Normal file

@@ -0,0 +1,39 @@
# Copyright (C) 2026 Paul Retourné
# SPDX-License-Identifier: GPL-3.0-or-later
import os
from pathlib import Path
from unisync.argparser import create_argparser
from unisync.runners import unisync_sync, unisync_add, unisync_mount
from unisync.config import load_config
from unisync.synchroniser import Synchroniser
from unisync.paths import *
from runners import *
def main():
parser = create_argparser(unisync_test, unisync_add, unisync_mount)
cli_args = parser.parse_args()
config_path = os.path.expanduser("./config.ini")
config = load_config(config_path)
print(config)
synchroniser = Synchroniser(
config.roots.remote,
config.roots.local,
config.server.user,
config.server.ip if config.server.ip != "" else config.server.hostname,
config.server.port,
config.unison.bools,
config.unison.values
)
paths_manager = PathsManager(Path(config.roots.local), config.other.cache_dir_path)
cli_args.func(synchroniser, paths_manager)
if __name__ == "__main__":
main()