Message-ID: <CAEf4BzbUEWp+TBjRXaL2XN8GwKYMJPO+PpRJ0uqgh2kOXKvBzg@mail.gmail.com>
Date: Thu, 30 Jul 2020 12:43:11 -0700
From: Andrii Nakryiko <andrii.nakryiko@...il.com>
To: Jakub Sitnicki <jakub@...udflare.com>
Cc: bpf <bpf@...r.kernel.org>, Networking <netdev@...r.kernel.org>,
kernel-team <kernel-team@...udflare.com>,
Alexei Starovoitov <ast@...nel.org>,
Daniel Borkmann <daniel@...earbox.net>,
"David S. Miller" <davem@...emloft.net>,
Jakub Kicinski <kuba@...nel.org>
Subject: Re: [PATCH bpf-next v5 15/15] selftests/bpf: Tests for BPF_SK_LOOKUP
attach point
On Thu, Jul 30, 2020 at 6:10 AM Jakub Sitnicki <jakub@...udflare.com> wrote:
>
> On Wed, Jul 29, 2020 at 10:57 AM CEST, Jakub Sitnicki wrote:
> > On Tue, Jul 28, 2020 at 10:13 PM CEST, Andrii Nakryiko wrote:
> >
> > [...]
> >
> >> We are getting this failure in Travis CI when syncing libbpf [0]:
> >>
> >> ```
> >> ip: either "local" is duplicate, or "nodad" is garbage
> >>
> >> switch_netns:PASS:unshare 0 nsec
> >>
> >> switch_netns:FAIL:system failed
> >>
> >> (/home/travis/build/libbpf/libbpf/travis-ci/vmtest/bpf-next/tools/testing/selftests/bpf/prog_tests/sk_lookup.c:1310:
> >> errno: No such file or directory) system(ip -6 addr add dev lo
> >> fd00::1/128 nodad)
> >>
> >> #73 sk_lookup:FAIL
> >> ```
> >>
> >>
> >> Can you please help fix it so that it works in a Travis CI environment
> >> as well? For now I disabled sk_lookup selftests altogether. You can
> >> try to repro it locally by forking https://github.com/libbpf/libbpf
> >> and enabling Travis CI for your account. See [1] for the PR that
> >> disabled sk_lookup.
>
> [...]
>
> Once this fix-up finds its way to bpf-next, we will be able to re-enable
> the sk_lookup tests:
>
> https://lore.kernel.org/bpf/20200730125325.1869363-1-jakub@cloudflare.com/
>
> And I now know that I need to test shell commands against the BusyBox
> 'ip' implementation, which the libbpf project uses in its CI environment.
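[Editorial note: a hedged sketch of the portability issue above. The "nodad" flag is an iproute2 extension that BusyBox's ip applet rejects, which is what produced the `either "local" is duplicate, or "nodad" is garbage` error. The helper below is illustrative only (the function name, the detection parameter, and the exact BusyBox argument order are assumptions, not taken from the actual patch):]

```shell
#!/bin/sh
# Build an "ip -6 addr add" command line that works with either
# iproute2 or BusyBox ip. Rather than probing the binary at runtime,
# the implementation flavor is passed in as a parameter so the logic
# is self-contained and testable.
build_addr_cmd() {
    addr="$1"; dev="$2"; have_iproute2="$3"
    if [ "$have_iproute2" = 1 ]; then
        # iproute2 accepts "nodad", which suppresses duplicate address
        # detection so the address is usable immediately.
        echo "ip -6 addr add dev $dev $addr nodad"
    else
        # BusyBox ip: no "nodad" support, so omit it and use the
        # plain ADDRESS-then-dev form it is known to accept.
        echo "ip -6 addr add $addr dev $dev"
    fi
}

build_addr_cmd fd00::1/128 lo 1
build_addr_cmd fd00::1/128 lo 0
```

In a real test harness the choice would be made by probing (e.g. checking whether `ip addr help` mentions `nodad`), but the point here is just that the two implementations need different command lines.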
Thanks! I still see some other failures; it might be that our
environment isn't complete enough (you also mentioned another fix to
Daniel, which might help as well, I'm not sure). But your fix is good
nevertheless.