Message-ID: <331930db-f5a1-4ad7-947f-7aaf5618c646@lunn.ch>
Date: Wed, 5 Jun 2024 14:51:51 +0200
From: Andrew Lunn <andrew@...n.ch>
To: Csókás, Bence <csokas.bence@...lan.hu>
Cc: netdev@...r.kernel.org, linux-kernel@...r.kernel.org,
trivial@...nel.org, Heiner Kallweit <hkallweit1@...il.com>,
Russell King <linux@...linux.org.uk>
Subject: Re: [RFC PATCH 1/2] net: include: mii: Refactor: Define LPA_* in
terms of ADVERTISE_*
On Wed, Jun 05, 2024 at 02:16:47PM +0200, Csókás, Bence wrote:
> The Ethernet specification mandates that these bits be equal.
> To reduce the amount of magic hexes in the code, just define
> them in terms of each other.
Are magic hexes in this context actually bad? In .c files I would
agree. But what you have in effect done is force me to jump through
another hoop to find the actual hex value so I can manually decode a
register value. And you have made the compile slightly slower.
These defines have been like this since the beginning of the git
history. Is there a good reason to change them after all that time?
Andrew