
Re: [atomic-devel] Discussion: How to keep image files in sync across repos

Hey guys,

We have been trying to solve a similar problem recently for our images [1]. Slavek (CCd) came up with distgen [2] as a result: the tool generates image sources for different operating systems from templates, and could perhaps help with your use case as well. Or you could help us extend it.
Getting stuff synced across repos was on our TODO list, but we haven't gotten to it yet, so we'd definitely be interested in collaborating.

-- Eliska

[1] https://github.com/container-images
[2] https://github.com/devexp-db/distgen/
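To make the template idea concrete, here is a minimal generic sketch using Python's stdlib string.Template — note this is purely illustrative and is not distgen's actual template syntax; the package names and base images below are assumptions:

```python
from string import Template

# Hypothetical shared Dockerfile template; distgen's real syntax differs.
DOCKERFILE_TEMPLATE = Template("""\
FROM $base_image
RUN $pkg_install docker
""")

# Per-distro values that would normally live in distro config files.
DISTROS = {
    "fedora": {"base_image": "fedora:26", "pkg_install": "dnf -y install"},
    "centos": {"base_image": "centos:7", "pkg_install": "yum -y install"},
}

def render(distro):
    """Render the shared template for one target distro."""
    return DOCKERFILE_TEMPLATE.substitute(DISTROS[distro])

if __name__ == "__main__":
    print(render("fedora"))
```

The point is that one upstream template plus small per-distro config files can replace N hand-synced Dockerfiles.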

On Tue, Oct 3, 2017 at 9:11 PM, Jason Brooks <jbrooks redhat com> wrote:
On Tue, Oct 3, 2017 at 11:52 AM, Stephen Milner <smilner redhat com> wrote:
> On Tue, Oct 3, 2017 at 2:28 PM, Dusty Mabe <dusty dustymabe com> wrote:
>> On 10/03/2017 09:58 AM, Stephen Milner wrote:
>>> In the last Atomic Community meeting I noted that keeping image files
>>> in sync across multiple repositories is going to become a support
>>> burden.
>>>   * ACTION: ashcrow jbrooks to start discussion on how to manage
>>>     multiple container repos  (jberkus, 16:19:09)
>>> As an example, the docker/container-engine image files reside in:
>>> - GitHub (CentOS and Fedora in different directories)
>>> - Fedora Repo (Fedora official builds)
>>> - RHEL Repo (RHEL builds)
>>> We consider the Fedora version in GitHub to be the upstream of all
>>> other image file sets. Each repo has its own standards for inclusion
>>> which change the files slightly. Each also gets slight modifications
>>> over time from its repo owners for version-specific bug fixes/standard
>>> updates. Add in that package names/configurations between the OSes may
>>> be different and it becomes clear that adding any bug fixes or
>>> enhancements to one file requires more work to keep files in sync than
>>> one would originally assume. Today we have just a small handful of
>>> these on our plate so we are able to manually keep the files in sync
>>> as we develop them. However, this won't scale as we continue to grow
>>> our images.
>>> Thinking quickly about the problem it seems like having a set of files
>>> upstream which are assembled into image files specific to their
>>> downstream repos would make a lot of sense. These could be generated
>>> either by the repo owners, using the upstream checkout and the
>>> assembly tool, or by developers.
>> Does that mean the 'source of truth' for everything lives in this upstream
>> repo and then gets synced to downstream repos somehow? If so then I think
>> that is the right approach.
> That's how we've been thinking about it. Though in practice the source
> of truth wanders to whichever repo owner finds and fixes the bug/adds
> the enhancement. Hopefully they, or someone else, alerts the other
> repo owners to merge in changes.
>>> (tl;dr)
>>> Questions:
>>> - Are there any _existing_tools_ that could help with keeping files
>>> across repos in sync? (aside from git + eyes)
>>> - Are there any _existing_tools_ which would allow us to
>>> generate/populate image files from components?
>>> - Are there any recommended processes already in place that we could
>>> adopt to simplify syncing?
>> what we essentially need is a bot that:
>> - monitors changes in a git repo
>> - maps files in one git repo to files in another git repo
>> - when upstream repo file changes open PR against downstream repo
> Interesting idea! It will also have to understand areas not to sync
> down, maybe through some sort of comment annotation in the file (e.g.
> when something is a CentOS-only workaround).

Upstream we'll still need per-distro branches or folders anyway, since
we'll always have different FROM lines.
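The sync-check logic such a bot might use could be sketched roughly like this — the "no-sync" comment markers are an invented convention here, not an existing standard, standing in for the annotation idea quoted above:

```python
# Markers a repo owner would place around downstream-only regions
# (hypothetical convention, purely illustrative).
NO_SYNC_START = "# no-sync-start"
NO_SYNC_END = "# no-sync-end"

def strip_no_sync(text):
    """Drop regions marked as downstream-only (e.g. a CentOS workaround)."""
    out, skipping = [], False
    for line in text.splitlines():
        if line.strip() == NO_SYNC_START:
            skipping = True
        elif line.strip() == NO_SYNC_END:
            skipping = False
        elif not skipping:
            out.append(line)
    return "\n".join(out)

def needs_sync(upstream, downstream):
    """True if the syncable parts of the two files differ,
    i.e. the bot should open a PR against the downstream repo."""
    return strip_no_sync(upstream) != strip_no_sync(downstream)
```

The actual repo monitoring and PR creation would sit on top of this, but the comparison step is what keeps downstream-only changes from generating noise.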

> --
> Thanks,
> Steve Milner
> Atomic | Red Hat | http://projectatomic.io/

Eliska Slobodova
Associate Manager, Software Engineering
