Compare commits


31 Commits

Author SHA1 Message Date
ec8030fc81 config : fix cache_dir_path value parsing error
Configparser's config.get returns a string and we want a Path. For the
moment, convert it to Path directly.
2026-01-03 18:05:43 +01:00
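The fix above in miniature: configparser always hands back strings, so a path-valued option has to be converted to `pathlib.Path` by hand. This is a sketch; the `[Other]`/`cache_dir_path` names mirror unisync's config but the snippet is standalone.

```python
# configparser returns str for every option; convert path values explicitly.
import configparser
from pathlib import Path

config = configparser.ConfigParser()
config.read_string("[Other]\ncache_dir_path = ~/.unisync\n")

raw = config.get("Other", "cache_dir_path")   # always a str
cache_dir = Path(raw).expanduser()            # convert and expand "~"

assert isinstance(raw, str)
assert isinstance(cache_dir, Path)
```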
f050dcc94f runners : fix sync runner not synchronising links
The wrong function was called in the sync runner (update_links instead of
sync_links), which meant the links were updated remotely but never
synchronised with the local copy.
Call sync_links instead.
We keep the call to update_links but set it to be in background.
2026-01-03 18:02:20 +01:00
f40a5c9276 Merge branch 'abstract_defaults'
Abstract the defaults into a separate file
2026-01-03 17:20:18 +01:00
0e80ba0b0d config : use the defaults from defaults.py
Remove the defaults from the dataclasses as they are redundant with the
fallbacks of configparser.
Use the values in defaults.py as the fallbacks instead of hardcoded
values.
2026-01-03 17:18:19 +01:00
a223f04909 config : take cache_dir_path into account
cache_dir_path and all of OtherConfig were ignored and the default
values were loaded; read their values from the config file instead.
2026-01-03 17:15:22 +01:00
e42ae71862 defaults : Create defaults.py
Creates the file defaults.py, which is used to store the defaults and
easily include them in the config.
Changing defaults is thus possible without touching the code, leaving
less room for errors.
2026-01-03 17:10:06 +01:00
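The pattern this commit describes can be sketched as follows: defaults live in one module and are passed to configparser as fallbacks. `DEFAULT_SERVER_PORT` mirrors the name in defaults.py; the rest of the snippet is illustrative.

```python
# Defaults defined in one place, used as configparser fallbacks.
import configparser

DEFAULT_SERVER_PORT = 22  # as declared in defaults.py

config = configparser.ConfigParser()
config.read_string("[Server]\nuser = alice\n")  # no port given

# getint falls back to the shared default when the option is absent
port = config.getint("Server", "port", fallback=DEFAULT_SERVER_PORT)
assert port == 22
```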
58c7f7d1be Add main as a script with poetry 2026-01-03 16:39:00 +01:00
eefb21faff Mark local imports as such.
Prefix local imports with "unisync." so they are not mistaken for
external module imports.
2026-01-03 16:24:58 +01:00
941c467fc2 Bump copyright year and add missing file headers 2026-01-02 10:58:27 +01:00
4dcab777ca Merge branch 'dev'
Got to a state that seems stable enough to go into main
2026-01-02 10:45:00 +01:00
a169890351 main : adds subcommands, move to Path and improve
Multiple changes to the main file; after this, unisync becomes kind of
usable.
Add subcommands: this uses the 2 previous commits to add the
subcommands to unisync; it is now possible to sync, add and mount.
pathlib: move from PosixPath to Path.
Remove unused imports.
Rename base_namespace to cli_args.
Add some comments and TODOs.
2026-01-02 10:44:59 +01:00
b70070ba1a argparser : adds subcommands to the argparser
This adds subcommands to the argparser using subparsers; we also set a
default value for func depending on which of the subcommands is
selected.
Also change the formatting of the epilog so it is on two lines.
2026-01-02 10:44:58 +01:00
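The subparser pattern from this commit can be sketched like this: each subcommand sets a default `func` that the caller dispatches on, and a parser-level default covers bare invocation. The handler functions here are stand-ins, not unisync's real runners.

```python
# Subcommands via subparsers, dispatching on a per-parser "func" default.
import argparse

def do_sync(args): return "sync"
def do_add(args): return "add"

parser = argparse.ArgumentParser(prog="demo")
parser.set_defaults(func=do_sync)               # bare invocation -> sync
subparsers = parser.add_subparsers(help="Actions")
parser_add = subparsers.add_parser("add", help="Add files.")
parser_add.set_defaults(func=do_add)            # "demo add" -> add

args = parser.parse_args([])
assert args.func(args) == "sync"
args = parser.parse_args(["add"])
assert args.func(args) == "add"
```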
bd72d740e6 runners : Create runners file and basic runners
This adds runners.py; it contains a set of functions that perform all the
various tasks that unisync can do (sync, add and mount for now).
They are simple functions that put together all the rest.
2026-01-02 10:44:57 +01:00
e43c16adb3 paths : fixes write_new_paths writing of the file
I was writing the file using 'w' instead of 'a', so the old paths were
deleted; use 'a' instead.
2026-01-02 10:44:56 +01:00
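The bug above in miniature: mode "w" truncates an existing file on open, while mode "a" appends and preserves the previously written paths.

```python
# "w" would overwrite the paths file; "a" appends to it.
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "paths")
with open(path, "w") as f:       # first write creates the file
    f.write("docs\n")
with open(path, "a") as f:       # "a" keeps the old content
    f.write("music\n")
assert open(path).read() == "docs\nmusic\n"
```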
10200fceb9 config : fallback to port 22 instead of None
The configparser fallback option for port was None; set it to 22
instead, as None doesn't make sense.
2026-01-02 10:44:55 +01:00
27924013d9 Bug fixes and small improvements
Fix :
- paths : true instead of True
- paths : Path has no len; convert to str first to get the number of
  characters

Improvements :
- Replace all PosixPath by Path
2026-01-01 17:24:46 +01:00
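The "Path has no len" fix above, in isolation: `Path` objects do not support `len()`, so the character count comes from `len(str(path))`, which unisync relies on when stripping the top-directory prefix from selected paths. The example paths are made up.

```python
# len() works on str, not Path; convert first when slicing off a prefix.
from pathlib import Path

p = Path("/home/user/files")
assert len(str(p)) == 16

full = "/home/user/files/docs/notes.txt"
relative = full[len(str(p)):].lstrip("/")   # drop the top-directory prefix
assert relative == "docs/notes.txt"
```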
138bc6d24a Update README to reflect the state of the project 2025-12-31 00:04:53 +01:00
48179034a7 Adds prerequisites to README 2025-12-31 00:03:59 +01:00
f9001ecb9d Adds usage of the paths manager 2025-12-31 00:03:17 +01:00
86a6c8acce Adds error handling for the paths 2025-12-30 17:56:03 +01:00
4f6f48247d Add classes for error handling 2025-12-30 17:54:47 +01:00
8caba75060 Adds paths adding functionality
Adds functions that allow adding new paths to the synchronisation.
When writing the new paths to the file, if a parent directory is
synchronised all the children are removed.
2025-08-10 18:21:10 +02:00
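The parent check described above can be sketched with `os.path.commonpath`: when the deepest shared ancestor of a candidate and an existing entry is the existing entry itself, the candidate is a child and gets dropped. The directory names are invented for illustration.

```python
# Drop candidate paths whose parent directory is already synchronised.
import os.path

existing = ["pictures", "music/albums"]
candidates = ["pictures/2024", "docs", "music/albums/live"]

to_add = []
for new_path in candidates:
    # commonpath(...) == existing entry  <=>  new_path is inside it
    contained = any(os.path.commonpath([new_path, e]) == e for e in existing)
    if not contained:
        to_add.append(new_path)
assert to_add == ["docs"]
```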
b35391f1f9 Adds an Other category to the config
This creates a new optional category in the config called Other that is
used to configure various aspects of unisync.
Currently it only allows customising the path to the cache directory,
which is ~/.unisync by default.
2025-07-31 11:47:44 +02:00
c5992ef19e Adds get_paths_to_sync and organise it in a class
This refactors the paths functions into a class called PathsManager,
allowing some data to be shared, like the Paths to the various
directories unisync works with.
This commit also creates the get_paths_to_sync method, which simply
reads the paths file and returns its content as a list.
2025-07-31 11:45:17 +02:00
837cc1bcf4 Adds mount_remote_dir
Adds the mount_remote_dir method to the synchroniser; this allows
mounting the remote directory in order to access it through the
generated links.
Also adds the background parameter to the documentation of update_links.
2025-07-28 15:31:40 +02:00
87db8a0498 Adds links generation and update
Adds update_links to the synchroniser, which updates the links.
It should also be able to generate links on the first run.
2025-07-27 17:46:37 +02:00
11513adf48 Improves user_select_files documentation 2025-07-25 12:02:25 +02:00
aaa4ef61d5 Adds beginning of paths management
The paths file will be used for everything related to the paths to
synchronise.
Adds the user_select_files function, which allows the user to select
paths.
2025-07-24 15:59:42 +02:00
fec09b6d0b Adds comments to the dataclasses in config 2025-07-24 15:48:13 +02:00
2566458e25 Complete README 2025-07-11 23:21:41 +02:00
14eb531e4a Adds unison config and test code
This adds the possibility to pass configuration options directly to
unison via the configuration file.
Also adds some test code to main.py.
2025-07-11 00:30:05 +02:00
10 changed files with 380 additions and 30 deletions

README.md

@@ -1,4 +1,26 @@
 Unisync is a data synchronisation tool written in python and based on [unison](https://github.com/bcpierce00/unison).
-The goal is to be able to keep data synchronised between multiple computers without needing to have all the data kept locally while at the same time being able to access everything. I couldn't find a tool to fulfill the requirements I had for a synchronisation tool so I am creating my own as a wrapper around unison.
-The development just started so the documentation will be written later.
+
+# Prerequisite
+You need to have the following tools installed.
+
+Locally :
+- unison
+- sshfs
+- nnn
+
+Remotely :
+- unison
+- cleanlinks and lndir (Should be in `xutils-dev` when using apt)
+
+# Goal
+Unisync's purpose is to keep personal data synchronised between multiple machines without needing to have all the data present on all the machines at the same time. For example you might not need to have your movies on your laptop but still want them on your desktop at home, or you might want to keep your old pictures only on a server.
+Unisync requires you to have a "server" (like a NAS at home) that will store all your data, allowing you to only copy what you need when you need it.
+The issue is that you need to know what data is stored on the server to avoid conflicts when creating duplicate files or folders. To address this, unisync places a symlink for every file you do not wish to keep locally and allows you to mount the remote filesystem (using sshfs) to access files that aren't synchronised.
+
+# Development
+Unisync was at first a simple bash script, but as it grew more complex I started struggling to maintain it, which is why I am porting it to python. This will make everything more robust and easier to maintain and extend.
+I am in the early stages of the development process; this should be usable someday (hopefully).

pyproject.toml

@@ -10,6 +10,9 @@ requires-python = ">=3.13"
 dependencies = [
 ]
+
+[project.scripts]
+unisync = "unisync.main:main"
 
 [tool.poetry]
 packages = [{include = "unisync", from = "src"}]

src/unisync/argparser.py

@@ -1,22 +1,36 @@
-# Copyright (C) 2025 Paul Retourné
+# Copyright (C) 2025-2026 Paul Retourné
 # SPDX-License-Identifier: GPL-3.0-or-later
 
 import argparse
 
-def create_argparser() -> argparse.ArgumentParser:
+def create_argparser(sync_function, add_function, mount_function) -> argparse.ArgumentParser:
+    """
+    Creates an argument parser to parse the command line arguments.
+    We use subparsers and set a default function for each to perform the correct action.
+    """
     parser = argparse.ArgumentParser(
         prog='unisync',
         description='File synchronisation application',
-        epilog="""
-Copyright © 2025 Paul Retourné.
-License GPLv3+: GNU GPL version 3 or later <https://gnu.org/licenses/gpl.html>."""
+        epilog="Copyright © 2025-2026 Paul Retourné.\n"
+               "License GPLv3+: GNU GPL version 3 or later <https://gnu.org/licenses/gpl.html>.",
+        formatter_class=argparse.RawDescriptionHelpFormatter
     )
     parser.add_argument("local", nargs="?")
     parser.add_argument("remote", nargs="?")
+    parser.set_defaults(func=sync_function)
     remote_addr_group = parser.add_mutually_exclusive_group()
     remote_addr_group.add_argument("--ip")
     remote_addr_group.add_argument("--hostname")
     parser.add_argument("--config", help="Path to the configuration file", metavar="path_to_config")
+
+    subparsers = parser.add_subparsers(help='Actions other than synchronisation')
+    parser_add = subparsers.add_parser('add', help='Add files to be synchronised.')
+    parser_add.set_defaults(func=add_function)
+    parser_mount = subparsers.add_parser('mount', help='Mount the remote.')
+    parser_mount.set_defaults(func=mount_function)
+
     return parser

src/unisync/config.py

@@ -1,20 +1,29 @@
-# Copyright (C) 2025 Paul Retourné
+# Copyright (C) 2025-2026 Paul Retourné
 # SPDX-License-Identifier: GPL-3.0-or-later
 
-from configparser import UNNAMED_SECTION
 from dataclasses import dataclass, field
+from pathlib import Path
 import ipaddress
 import configparser
+from configparser import UNNAMED_SECTION
+
+from unisync.defaults import *
 
 @dataclass
 class ServerConfig:
+    """
+    Dataclass keeping the config for connecting to the server
+    """
     user: str
-    sshargs: list[str] | None = field(default_factory=list)
-    hostname: str = ""
-    ip: str = ""
-    port: int = 22
+    sshargs: str
+    hostname: str
+    ip: str
+    port: int
 
     def __post_init__(self):
+        """
+        Make sure a remote is provided and the ip address is valid
+        """
         if self.ip == "" and self.hostname == "":
             raise ValueError("A remote must be provided (ip or hostname)")
@@ -26,13 +35,37 @@ class ServerConfig:
 
 @dataclass
 class RootsConfig:
+    """
+    Dataclass keeping the paths to the roots to synchronise
+    """
     local: str
     remote: str
 
+@dataclass
+class UnisonConfig:
+    """
+    Dataclass keeping unison specific configurations
+    """
+    bools: list
+    values: dict
+
+@dataclass
+class OtherConfig:
+    """
+    Dataclass keeping miscellaneous configuration options
+    """
+    cache_dir_path: Path
+
 @dataclass
 class Config:
+    """
+    Main dataclass for the configurations
+    """
     server: ServerConfig
     roots: RootsConfig
+    unison: UnisonConfig
+    other: OtherConfig
 
 def load_config(config_path:str) -> Config:
     """
@@ -42,22 +75,39 @@ def load_config(config_path:str) -> Config:
     Returns:
         Config: A populated Config object containing the loaded config.
     """
-    config = configparser.ConfigParser(allow_unnamed_section=True)
+    config = configparser.ConfigParser(allow_unnamed_section=True, allow_no_value=True)
     config.read(config_path)
     # Check if sections are provided
     server_section = "Server" if "Server" in config.sections() else UNNAMED_SECTION
     roots_section = "Roots" if "Roots" in config.sections() else UNNAMED_SECTION
+    other_section = "Other" if "Other" in config.sections() else UNNAMED_SECTION
     server_config = ServerConfig(
         config.get(server_section, "user"),
-        config.get(server_section, "sshargs", fallback=None),
-        config.get(server_section, "hostname", fallback=None),
-        config.get(server_section, "ip", fallback=None),
-        config.getint(server_section, "port", fallback=None)
+        config.get(server_section, "sshargs", fallback=DEFAULT_SERVER_SSHARGS),
+        config.get(server_section, "hostname", fallback=DEFAULT_SERVER_HOSTNAME),
+        config.get(server_section, "ip", fallback=DEFAULT_SERVER_IP),
+        config.getint(server_section, "port", fallback=DEFAULT_SERVER_PORT)
     )
     roots_config = RootsConfig(
-        config.get(roots_section, "local"),
+        config.get(roots_section, "local", fallback=DEFAULT_ROOTS_LOCAL),
         config.get(roots_section, "remote")
     )
-    return Config(server_config, roots_config)
+    other_config = OtherConfig(
+        Path(config.get(other_section, "cache_dir_path", fallback=DEFAULT_MISC_CACHE_DIR_PATH)).expanduser()
+    )
+    args_bool = list()
+    args_val = dict()
+    if "Unison" in config.sections():
+        for key, val in config.items("Unison"):
+            if key in config["DEFAULT"].keys():
+                continue
+            elif val == "" or val == None:
+                args_bool.append(key)
+            else:
+                args_val[key] = val
+    unison_config = UnisonConfig(args_bool, args_val)
+    return Config(server_config, roots_config, unison_config, other_config)
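The `[Unison]` parsing added in this diff can be exercised in isolation: with `allow_no_value=True`, an option written without a value comes back as `None` and is treated as a boolean flag, while `key = value` pairs are collected as a dict. The option names here are examples.

```python
# Split a [Unison] section into bare flags and key/value options.
import configparser

config = configparser.ConfigParser(allow_no_value=True)
config.read_string("[Unison]\nauto\nperms = 0\n")

args_bool, args_val = [], {}
for key, val in config.items("Unison"):
    if val == "" or val is None:   # option given without a value -> flag
        args_bool.append(key)
    else:
        args_val[key] = val

assert args_bool == ["auto"]
assert args_val == {"perms": "0"}
```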

src/unisync/defaults.py Normal file

@@ -0,0 +1,18 @@
+# Copyright (C) 2026 Paul Retourné
+# SPDX-License-Identifier: GPL-3.0-or-later
+
+from pathlib import Path
+
+# Commented out values are part of the config but are required, so there are no defaults.
+# This allows this file to be a list of all the config options.
+
+# DEFAULT_SERVER_USER: str = ""
+DEFAULT_SERVER_SSHARGS: str = ""
+DEFAULT_SERVER_HOSTNAME: str = ""
+DEFAULT_SERVER_IP: str = ""
+DEFAULT_SERVER_PORT: int = 22
+
+DEFAULT_ROOTS_LOCAL: str = str(Path("~/files").expanduser())
+# DEFAULT_ROOTS_REMOTE: str = ""
+
+DEFAULT_MISC_CACHE_DIR_PATH: Path = Path("~/.unisync").expanduser()

src/unisync/errors.py Normal file

@@ -0,0 +1,8 @@
+# Copyright (C) 2025-2026 Paul Retourné
+# SPDX-License-Identifier: GPL-3.0-or-later
+
+class RemoteMountedError(BaseException):
+    pass
+
+class InvalidMountError(BaseException):
+    pass

src/unisync/main.py

@@ -1,22 +1,46 @@
-# Copyright (C) 2025 Paul Retourné
+# Copyright (C) 2025-2026 Paul Retourné
 # SPDX-License-Identifier: GPL-3.0-or-later
 
 import os
+from pathlib import Path
 
-from argparser import create_argparser
-from config import RootsConfig, ServerConfig, Config, load_config
+from unisync.argparser import create_argparser
+from unisync.runners import unisync_sync, unisync_add, unisync_mount
+from unisync.config import load_config
+from unisync.synchroniser import Synchroniser
+from unisync.paths import *
 
 def main():
-    parser = create_argparser()
-    base_namespace = parser.parse_args()
+    parser = create_argparser(unisync_sync, unisync_add, unisync_mount)
+    cli_args = parser.parse_args()
     config_path = os.path.expanduser("~/.config/unisync/config.ini")
-    if base_namespace.config != None and os.path.isfile(base_namespace.config):
-        config = load_config(base_namespace.config)
+    # Check if --config is set
+    if cli_args.config != None and os.path.isfile(cli_args.config):
+        config = load_config(cli_args.config)
     elif os.path.isfile(config_path):
         config = load_config(config_path)
     else:
-        # TODO make the command line arguments work and override the config options
+        # TODO replace the next line with something to do if no config file is found
+        config = load_config(config_path)
         pass
+    # TODO make the command line arguments work and override the config options
+
+    synchroniser = Synchroniser(
+        config.roots.remote,
+        config.roots.local,
+        config.server.user,
+        config.server.ip if config.server.ip != "" else config.server.hostname,
+        config.server.port,
+        config.unison.bools,
+        config.unison.values
+    )
+    paths_manager = PathsManager(Path(config.roots.local), config.other.cache_dir_path)
+    cli_args.func(synchroniser, paths_manager)
 
 if __name__ == "__main__":
     main()

src/unisync/paths.py Normal file

@@ -0,0 +1,113 @@
+# Copyright (C) 2025-2026 Paul Retourné
+# SPDX-License-Identifier: GPL-3.0-or-later
+
+import os.path
+import subprocess
+import sys
+from pathlib import Path
+
+class PathsManager:
+    def __init__(self, local_dir:Path, cache_dir:Path):
+        """
+        Creates a PathsManager with the necessary data
+
+        Args:
+            local_dir: Path to the top directory of the synchronisation
+            cache_dir: Path to the cache directory that contains the paths file
+        """
+        if not local_dir.is_dir():
+            raise ValueError("Invalid local directory")
+        self.local_dir = local_dir
+        if not cache_dir.is_dir():
+            raise ValueError("Invalid cache directory")
+        self.cache_dir = cache_dir
+        self.paths_file:Path = self.cache_dir / "paths"
+        if not self.paths_file.is_file():
+            raise ValueError("The paths file does not exist")
+
+    def user_select_files(self, choice_timeout:int=120) -> list[str]:
+        """
+        Make the user select files in the top directory.
+        Currently uses nnn for the selection.
+        The goal is to replace it in order to avoid using external programs.
+
+        Args:
+            choice_timeout: Time given to make choices in nnn
+        Returns:
+            list[str]: The list of paths that was selected relative to the top directory
+        Raise:
+            TimeoutExpired: User took too long to choose
+            CalledProcessError: An unknown error occurred during the selection
+        """
+        command = [
+            "/usr/bin/nnn",
+            "-H",
+            "-p", "-",
+            self.local_dir
+        ]
+        nnn_process:subprocess.Popen = subprocess.Popen(command, stdout=subprocess.PIPE)
+        try:
+            ret_code = nnn_process.wait(timeout=choice_timeout)
+        except subprocess.TimeoutExpired as e:
+            print("Choice timeout expired", file=sys.stderr)
+            raise e
+        if ret_code != 0:
+            print("File selection failed", file=sys.stderr)
+            raise subprocess.CalledProcessError(1, "File selection failed")
+        paths_list:list[str] = []
+        while (next_path := nnn_process.stdout.readline()) != b'':
+            next_path = next_path.decode().strip()
+            # Make the path relative to the top directory
+            next_path = next_path[len(str(self.local_dir)):].lstrip("/")
+            paths_list.append(next_path)
+        return paths_list
+
+    def add_files_to_sync(self):
+        while True:
+            try:
+                paths = self.user_select_files()
+                break
+            except subprocess.TimeoutExpired:
+                if input("Timeout expired do you want to retry (y/n): ") != "y":
+                    raise
+        self.write_new_paths(paths)
+
+    def get_paths_to_sync(self) -> list[str]:
+        """
+        Return the paths to synchronise as a list.
+        """
+        paths:list[str] = self.paths_file.read_text().split("\n")
+        if paths[-1] == "":
+            paths.pop()
+        return paths
+
+    def write_new_paths(self, paths:list[str]):
+        """
+        Writes a list of new paths to the file
+        """
+        current_paths = self.get_paths_to_sync()
+        paths_to_add = list()
+        # Check if one of the parents is already being synchronised
+        # If so there is no need to add the child path
+        for new_path in paths:
+            is_contained = False
+            for existing in current_paths:
+                common = os.path.commonpath([new_path, existing])
+                if common == existing:
+                    is_contained = True
+                    break
+            if not is_contained and new_path not in paths_to_add:
+                paths_to_add.append(new_path)
+        with self.paths_file.open("a") as f:
+            for p in paths_to_add:
+                f.write(p + "\n")

src/unisync/runners.py Normal file

@@ -0,0 +1,36 @@
+# Copyright (C) 2025-2026 Paul Retourné
+# SPDX-License-Identifier: GPL-3.0-or-later
+
+from unisync.synchroniser import Synchroniser
+from unisync.paths import PathsManager
+
+def unisync_sync(synchroniser:Synchroniser, paths_manager:PathsManager):
+    if synchroniser.create_ssh_master_connection() != 0:
+        print("Connection failed quitting")
+        return 1
+    print("Connected to the remote.")
+    synchroniser.sync_files(paths_manager.get_paths_to_sync())
+    synchroniser.sync_links(paths_manager.get_paths_to_sync())
+    # TODO check the config options and do or don't do the following
+    synchroniser.update_links()
+    #synchroniser.mount_remote_dir()
+    synchroniser.close_ssh_master_connection()
+
+def unisync_add(synchroniser:Synchroniser, paths_manager:PathsManager):
+    if synchroniser.create_ssh_master_connection() != 0:
+        print("Connection failed quitting")
+        return 1
+    print("Connected to the remote.")
+    paths_manager.add_files_to_sync()
+    synchroniser.close_ssh_master_connection()
+
+def unisync_mount(synchroniser:Synchroniser, paths_manager:PathsManager):
+    synchroniser.mount_remote_dir()

src/unisync/synchroniser.py

@@ -1,4 +1,4 @@
-# Copyright (C) 2025 Paul Retourné
+# Copyright (C) 2025-2026 Paul Retourné
 # SPDX-License-Identifier: GPL-3.0-or-later
 
 import subprocess
@@ -7,6 +7,10 @@ import sys
 import time
 import logging
+from pathlib import Path
+
+from unisync.errors import RemoteMountedError, InvalidMountError
 
 logger = logging.getLogger(__name__)
 
 class Synchroniser:
@@ -131,8 +135,66 @@ class Synchroniser:
         command.append(remote_root)
         command.append("-batch")
+        print(command)
         proc = subprocess.Popen(command)
         ret_code = proc.wait()
         return ret_code
+
+    def update_links(self, background:bool=True):
+        """
+        Update the links on the remote.
+        First calls cleanlinks to remove deadlinks and empty directories.
+        Then calls lndir to create the new links.
+
+        Args:
+            - background: controls if the update is done in the background or waited for
+        """
+        link_update_script = (f"cd {self.remote_dir}/links && "
+                              "cleanlinks && "
+                              "lndir -withrevinfo -ignorelinks -silent ../.data .;")
+        if background:
+            link_background_wrapper = f"nohup bash -c \"{link_update_script}\" > /dev/null 2>&1 < /dev/null &"
+        else:
+            link_background_wrapper = link_update_script
+        command = [
+            "/usr/bin/ssh",
+            "-S", self.control_path,
+            f"{self.remote_user}@{self.remote_ip}",
+            "-p", str(self.remote_port),
+            link_background_wrapper
+        ]
+        link_update_process = subprocess.Popen(command, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
+        if not background:
+            print("Starting links update.")
+            link_update_process.wait()
+            print("Done")
+
+    def mount_remote_dir(self):
+        """
+        Mount the remote directory to make the local links work.
+        This is achieved using sshfs.
+
+        Raise:
+            - RemoteMountedError: The .data directory is already a mount point
+            - InvalidMountError: .data is either not a directory or not empty
+            - subprocess.CalledProcessError: An error occurred with sshfs
+        """
+        # Get the absolute path to the correct .data directory resolving symlinks
+        path_to_mount:Path = Path(f"{self.local}/../.data").resolve()
+        if path_to_mount.is_mount():
+            raise RemoteMountedError
+        # Check if it is an empty directory
+        if not path_to_mount.is_dir() or any(path_to_mount.iterdir()):
+            raise InvalidMountError
+        command = [
+            "/usr/bin/sshfs",
+            "-o", f"ControlPath={self.control_path}",
+            "-o", "ServerAliveInterval=15",
+            "-p", str(self.remote_port),
+            f"{self.remote_user}@{self.remote_ip}:{self.remote_dir}/.data",
+            str(path_to_mount)
+        ]
+        completed_process = subprocess.run(command)
+        completed_process.check_returncode()
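The mount preconditions from `mount_remote_dir` can be checked in isolation: the target must exist, be a directory, be empty, and not already be a mount point. This sketch wraps those checks in a hypothetical helper rather than raising unisync's exception classes.

```python
# Validate a prospective sshfs mount point: existing, empty, not mounted.
import tempfile
from pathlib import Path

def check_mount_target(path_to_mount: Path) -> bool:
    if path_to_mount.is_mount():
        return False                       # already a mount point
    if not path_to_mount.is_dir():
        return False                       # missing or not a directory
    if any(path_to_mount.iterdir()):
        return False                       # not empty
    return True

empty_dir = Path(tempfile.mkdtemp())
assert check_mount_target(empty_dir) is True
(empty_dir / "file").touch()               # a non-empty dir is rejected
assert check_mount_target(empty_dir) is False
```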