Conversion from ARGB to RGBA

I'm trying to write my own Color struct for a task I have. My goal is that my buffer always contains an RGBA value, even when it is initialized with ARGB values.

I have two constructors:

  1. Takes R, G, B, A values of type uint8_t separately - works like a charm.
  2. Takes a uint32_t that holds an ARGB value - the problem begins here: I have a method ("fromArgb") that converts the provided ARGB into RGBA. It seems like only the conversion to red works correctly; all other colors come out wrong.

Examples:

  • Color red = Color(0xFFFF0000); // Works well (contains: Hex: #FF0000FF; R: 255, G: 0, B: 0, A: 255)

  • Color green = Color(0xFF008000); // Wrong - actually pinkish (contains: Hex: #008000FF; R: 255, G: 0, B: 128, A: 0)

  • Color blue = Color(0xFF0000FF); // Wrong - actually yellowish (contains: Hex: #0000FFFF; R: 255, G: 255, B: 0, A: 0)

  • Color yellow = Color(0xFFFFFF00); // Wrong - actually pinkish (contains: Hex: #FFFF00FF; R: 255, G: 0, B: 255, A: 255)

I can't seem to find the problem. I'll be more than glad to have some support from the community!

Example source code:

#include <cstdint>
#include <cstdio>
#include <string>

struct Color
{
        public:
        /* Works fine!!! */
        Color(uint8_t r, uint8_t g, uint8_t b, uint8_t a = 255)
            // Member initializer list; R is packed into the low byte
            : buffer((uint32_t(r) << 0) | (uint32_t(g) << 8) |
                     (uint32_t(b) << 16) | (uint32_t(a) << 24))
        {
        }

        Color(const uint32_t argb)
        {
          buffer = fromArgb(argb);
        }


        inline uint32_t fromArgb(uint32_t argb)
        {
            return
                // Source is in format: 0xAARRGGBB
                ((argb & 0x00FF0000) << 8)  | //RR______
                ((argb & 0x0000FF00) << 8)  | //__GG____
                ((argb & 0x000000FF) << 8)  | //____BB__
                ((argb & 0xFF000000) >> 24);  //______AA
                // Return value is in format:  0xRRGGBBAA
        }

        inline uint8_t getRed(void) const
        {
            return (buffer >> 0) & 0xFF;
        }

        inline uint8_t getGreen(void) const
        {
            return (buffer >> 8) & 0xFF;
        }

        inline uint8_t getBlue(void) const
        {
            return (buffer >> 16) & 0xFF;
        }

        inline uint8_t getAlpha(void) const
        {
            return (buffer >> 24) & 0xFF;
        }

        /* Works fine!!!*/
        std::string getHex(void) const
        {   
            std::string result    = "#";
            char colorBuffer[255] = {};

            // Order is intentionally end to beginning
            sprintf_s(colorBuffer, 255, "%.2X", getAlpha());
            result.append(colorBuffer);

            sprintf_s(colorBuffer, 255, "%.2X", getBlue());
            result.append(colorBuffer);

            sprintf_s(colorBuffer, 255, "%.2X", getGreen());
            result.append(colorBuffer);

            sprintf_s(colorBuffer, 255, "%.2X", getRed());
            result.append(colorBuffer);

            return result;
        }

        private:
        uint32_t buffer;
};
Tags: c++, colors, bit-shift, rgba, argb
asked on Stack Overflow May 6, 2020 by Ron Nuni

1 Answer

Looks to me like the class is holding an ABGR value (the getters read red from the low byte and alpha from the high byte), so obviously a conversion from ARGB to RGBA isn't helpful. This seems right (untested, though):

    inline uint32_t fromArgb(uint32_t argb)
    {
        return
            // Source is in format: 0xAARRGGBB
            ((argb & 0x00FF0000) >> 16)  | //______RR
            ((argb & 0x0000FF00))        | //____GG__
            ((argb & 0x000000FF) << 16)  | //__BB____
            ((argb & 0xFF000000));         //AA______
            // Return value is in format:  0xAABBGGRR 
    }
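
As a quick sanity check, here is a minimal sketch (not from the original answer) that runs the question's four examples through the corrected conversion; the standalone fromArgb and main below are added purely for illustration:

    #include <cstdint>
    #include <cstdio>

    // Corrected conversion: 0xAARRGGBB in, 0xAABBGGRR out (R in the low byte),
    // which is the layout the question's getters expect.
    static uint32_t fromArgb(uint32_t argb)
    {
        return ((argb & 0x00FF0000) >> 16) |
               ( argb & 0x0000FF00)        |
               ((argb & 0x000000FF) << 16) |
               ( argb & 0xFF000000);
    }

    int main()
    {
        const uint32_t tests[] = { 0xFFFF0000, 0xFF008000, 0xFF0000FF, 0xFFFFFF00 };
        for (uint32_t argb : tests)
        {
            const uint32_t buf = fromArgb(argb);
            std::printf("ARGB %08X -> R:%3u G:%3u B:%3u A:%3u\n",
                        (unsigned)argb,
                        (unsigned)( buf        & 0xFF),  // red: low byte
                        (unsigned)((buf >> 8)  & 0xFF),  // green
                        (unsigned)((buf >> 16) & 0xFF),  // blue
                        (unsigned)((buf >> 24) & 0xFF)); // alpha
        }
    }

With the fix in place, green should come out as R: 0, G: 128, B: 0, A: 255 rather than the swapped channels reported in the question.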
answered on Stack Overflow May 6, 2020 by john

User contributions licensed under CC BY-SA 3.0