Musl Distros - GNU/Linux
z3bra
For now, I decided to focus on add/delete/list only, as they are the most important actions.
Updates will definitely be handled though, and in a simple way. Here is what I have in mind:

if the package version in the tarball name is different from what's installed, then:

0. remove files that are not in the tarball anymore
1. unpack files from the tarball, overwriting the old ones
2. update the "version" file in the metadata directory

For the first two steps, the only condition would be that creation time == last modification time.
If the user modified a file by hand, it means we shouldn't remove or replace it. In that case, two actions will be taken:

0. warn the user that the file shouldn't exist anymore and will not be removed in the future
1. write the new file as ${filename}.new and report it to the user

I can't think of why this process would not cover all cases :)
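In shell, the whole thing would look roughly like this (just a sketch of the idea; the metadata layout under /var/pm/ is made up for the example, it's not pm's actual code):

Code:
#!/bin/sh
# update sketch -- hypothetical metadata layout: /var/pm/<name>/version
# and /var/pm/<name>/files. "creation time" is approximated with ctime,
# since most filesystems don't track a real creation time.
pack=$1                                      # e.g. /path/to/dash:0.5.9.tar.bz2
name=$(basename "$pack"); name=${name%%:*}
newver=$(basename "$pack"); newver=${newver#*:}; newver=${newver%%.tar.*}
oldver=$(cat "/var/pm/$name/version")

[ "$newver" = "$oldver" ] && exit 0          # same version, nothing to do

tar tf "$pack" | sort > /tmp/new.list
sort "/var/pm/$name/files" > /tmp/old.list

# 0. remove files that are no longer in the tarball, unless modified by hand
comm -23 /tmp/old.list /tmp/new.list | while read -r f; do
        if [ "$(stat -c %Z "/$f")" = "$(stat -c %Y "/$f")" ]; then
                rm -f "/$f"
        else
                echo "warning: /$f was modified, keeping it" >&2
        fi
done

# 1. unpack the new files over the old ones
tar xf "$pack" -C /

# 2. update the "version" file in the metadata directory
echo "$newver" > "/var/pm/$name/version"

The handling of hand-modified files on unpack (writing ${filename}.new and reporting it) is left out here for brevity.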
apk
Very cool indeed z3bra. Like I said, I'm really intrigued by your package manager, but I have to ask: would that process only be possible if the tarball(s) in question were already on the filesystem? Or does this mechanism work with the download location of the tarball? I've noticed that projects TEND to have predictable tarball locations (i.e., most GNU software uses the same FTP/HTTP hierarchy), but this isn't true for a lot of software.

Honestly, I'm really only homing in on this specific feature because a few years ago I attempted the same thing you have, and I failed miserably.
z3bra
pm(1) only takes local files as arguments; it is not meant to deal with external repos of any kind.
I have written two small shell scripts, namely repo(1) and repogen(1), to deal with remote tarball fetching or whatever. I've listed some examples in pm's README, but for all the lazy asses out there (I know there's a lot of you!), here is an idea of how it works.

Let's say you have two machines, "server" and "client". On the server, you'll have the build system of your choice, with all tarballs created under /var/www/dl.z3bra.org/releases. You'd use repogen(1) to turn this directory into a repository (it simply generates a .list file with the name, version and checksum of all tarballs):

Code:
[root@server ~]# repogen /var/www/dl.z3bra.org/releases
bzip2   1.0.6   2df1fe882e46a1e614d0fa2c72590c2c0354923d
curl    7.42.1  be76b8b93528156fe36049ee92dd64001607b263
dash    0.5.8   3a4833d32c1c633c11bd048d6efd89cc0b7620d0
fs      0.1     3ffeef271370b629ed6a2b93023c0372fd176680
gzip    1.3.3   b82268b6ad4d86db2d11aa28352b8c20cd40a994
iproute2        3.18.0  c47b14dfda5617e3a79d491ad9e9e109c2a81ebc
mk      0.1     dd7bbbe7d7cd0f69fc44d606b94d3ef574cf3693
mksh    R50e    01119230f24f488777cff2cc46cc5aae1aecaa6a
musl    1.1.12  eaa54d777dd5d341378c1d3742b30f532e4ac653
pcc-libs        1.1.0   ada38ca8ee75e734663a0be75927df1303954c81
pcc     1.1.0   9a1b3f7fd8cad795026cf98415165dc8756d296c
sbase   0.0     c7a78f26f9d3ee92c1d43d3078e12f54c9cda6e7
ubase   0.0     73b4e3ee83b78b0ddcf14672713f3e6d11311eae
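
In spirit, repogen(1) does little more than this (a simplified sketch, not the actual script; the name:version.tar.* naming and the SHA-1 sums are assumed from the listing above):

Code:
#!/bin/sh
# repogen sketch: list every tarball in a directory as "name version sha1"
# and save that listing as .list
cd "${1:-.}" || exit 1
for t in *.tar.*; do
        name=${t%%:*}
        ver=${t#*:}; ver=${ver%%.tar.*}
        sum=$(sha1sum "$t" | cut -d' ' -f1)
        printf '%s\t%s\t%s\n' "$name" "$ver" "$sum"
done | tee .list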

Now that you have your remote repo set up, you can sync it locally using the repo(1) utility:

Code:
[z3bra@client ~]$ export REPO=$HOME/repo
[z3bra@client ~]$ repo -s
bzip2   1.0.6   2df1fe882e46a1e614d0fa2c72590c2c0354923d
curl    7.42.1  be76b8b93528156fe36049ee92dd64001607b263
dash    0.5.8   3a4833d32c1c633c11bd048d6efd89cc0b7620d0
fs      0.1     3ffeef271370b629ed6a2b93023c0372fd176680
gzip    1.3.3   b82268b6ad4d86db2d11aa28352b8c20cd40a994
iproute2        3.18.0  c47b14dfda5617e3a79d491ad9e9e109c2a81ebc
mk      0.1     dd7bbbe7d7cd0f69fc44d606b94d3ef574cf3693
mksh    R50e    01119230f24f488777cff2cc46cc5aae1aecaa6a
musl    1.1.12  eaa54d777dd5d341378c1d3742b30f532e4ac653
pcc-libs        1.1.0   ada38ca8ee75e734663a0be75927df1303954c81
pcc     1.1.0   9a1b3f7fd8cad795026cf98415165dc8756d296c
sbase   0.0     c7a78f26f9d3ee92c1d43d3078e12f54c9cda6e7
ubase   0.0     73b4e3ee83b78b0ddcf14672713f3e6d11311eae

Syncing the repository will NOT download all the tarballs. That would be a huge waste of time. You can list the content of your local repo with "repo -l". This will give the same output as when you sync the repo, minus the checksums.
To actually download a tarball, simply run repo(1) with the name of the pack you want. Once done, the full path to the tarball will be printed to stdout:

Code:
[z3bra@client ~]$ repo dash
/home/z3bra/repo/dash:0.5.8.tar.bz2
[z3bra@client ~]$ tree $REPO
/home/z3bra/repo
├── dash:0.5.8.tar.bz2
└── .list

If you call the command again, repo will check the tarball's checksum against its local .list file and, if they match, output the full path to stdout again without redownloading the file (this makes sense, right?).
You can then use repo's output to feed pm(1) without having to know where the tarballs are located:

Code:
[z3bra@client ~]$ pm -a $(repo dash)

is equivalent to:

Code:
[z3bra@client ~]$ repo dash
/home/z3bra/repo/dash:0.5.8.tar.bz2
[z3bra@client ~]$ pm -a /home/z3bra/repo/dash:0.5.8.tar.bz2
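
To give you an idea, the fetching part of repo(1) boils down to something like this (again a rough sketch, not the real script; the $REMOTE variable, the .tar.bz2 extension and the SHA-1 check are assumptions based on the output above):

Code:
#!/bin/sh
# repo <pack> sketch: look the pack up in $REPO/.list, download the tarball
# only if it is missing or its checksum no longer matches, then print its path
pack=$1
set -- $(awk -v p="$pack" '$1 == p' "$REPO/.list")      # -> name version sha1
[ -n "$1" ] || { echo "no such pack: $pack" >&2; exit 1; }
tarball="$REPO/$1:$2.tar.bz2"
if [ ! -f "$tarball" ] || [ "$(sha1sum "$tarball" | cut -d' ' -f1)" != "$3" ]; then
        curl -s -o "$tarball" "$REMOTE/$1:$2.tar.bz2"
fi
echo "$tarball"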

At some point, I'll try to make pm read paths from stdin, leading to a simpler way to install tarballs ;)
Repo can already read pack names from stdin, leading to some neat one-liners! For example, to download all packs except the libraries, you can run the following:

Code:
[z3bra@client ~]$ repo -s | grep -v 'lib' | cut -f1 | repo
/home/z3bra/repo/bzip2:1.0.6.tar.bz2
/home/z3bra/repo/curl:7.42.1.tar.bz2
/home/z3bra/repo/dash:0.5.8.tar.bz2
/home/z3bra/repo/fs:0.1.tar.bz2
/home/z3bra/repo/gzip:1.3.3.tar.bz2
/home/z3bra/repo/iproute2:3.18.0.tar.bz2
/home/z3bra/repo/mk:0.1.tar.bz2
/home/z3bra/repo/mksh:R50e.tar.bz2
/home/z3bra/repo/musl:1.1.12.tar.bz2
/home/z3bra/repo/pcc:1.1.0.tar.bz2
/home/z3bra/repo/sbase:0.0.tar.bz2
/home/z3bra/repo/ubase:0.0.tar.bz2

Note that this is still a work in progress, which is why I haven't committed either repo(1) or repogen(1) yet. I'm still trying to figure out how they should work, and what the workflow should be.
If you're interested in giving it a shot and reporting anything that feels awkward or incoherent, just ask and I'll paste the scripts ;)
venam
(15-01-2016, 08:51 AM)z3bra Wrote: pm(1) only takes local files as arguments; it is not meant to deal with external repos of any kind.
I have written two small shell scripts, namely repo(1) and repogen(1), to deal with remote tarball fetching or whatever.
Seems like a great program for something like sharing packages between test machines or work machines, for example on a local network box.
It's easier to set up than any other package manager's repo syncing, and it offers the flexibility to choose what is shared.
XcelQ
I love this distro already, if he gets it working. That's my dream musl-based OS.
z3bra
A little update for this thread. My pack manager seems to be working now, in a sane, reliable and repeatable way. I am now focusing on making the distro a standalone environment.
I want to be able to bootstrap my distro from within it. The real missing part here is gcc. That bitch is hell to get right.
I've dropped my "in-house" cross-compiler for something more professional and reliable: crosstools-ng. It provides a toolchain that is more solid than my hacky solution, but also less flexible (my toolchain could ONLY compile static binaries, and didn't make use of any directory having a /usr prefix).
So I'll need to investigate it a bit further. Still, I've been able to use it with most of my ports, so it's a nice step forward.

If anyone is interested in giving it a try, I can help!
venam
(09-02-2016, 07:56 AM)z3bra Wrote: If anyone is interested in giving it a try, I can help!
I did give it a try!
The whole setup process was quite tedious.
pm is amazingly simple and fits static compilation well.
I didn't dig too much into how mk works to help with the gcc compilation; I'll try to focus on that next time.
It's all more complex than I thought.
z3bra
It is indeed complex, but it's also a huge mine of knowledge! This is what kept me hooked from the start.
z3bra
Quick update! I'm finally running my distro in a chroot, with all packs compiled from within this chroot (including gcc and binutils!). A few packs are still resisting the compilation process (m4 and diffutils, for example), but overall it works quite well!

I will hopefully boot it from real hardware soon, so I can start working on better init and service management programs!
jkl
2017 update:
Note that midipix, the native "musl for Windows", is already shaping up nicely:
http://midipix.org



