Re: [PATCH] drm/ssd130x: Change pixel format used to compute the buffer size

From: Thomas Zimmermann
Date: Thu Jul 13 2023 - 06:07:56 EST




Am 13.07.23 um 10:58 schrieb Javier Martinez Canillas:
The commit e254b584dbc0 ("drm/ssd130x: Remove hardcoded bits-per-pixel in
ssd130x_buf_alloc()") used a pixel format info instead of a hardcoded bpp
to calculate the size of the buffer allocated to store the native pixels.

But that wrongly used the DRM_FORMAT_C1 fourcc pixel format, which is for
color-indexed frame buffer formats, while the ssd130x controllers support
neither different single-channel colors nor a Color Lookup Table (CLUT).

Makes sense to me.

Reviewed-by: Thomas Zimmermann <tzimmermann@xxxxxxx>


Both formats use eight pixels per byte, so in practice this patch makes no
functional change. Still, the correct pixel format should be used.

Suggested-by: Geert Uytterhoeven <geert@xxxxxxxxxxxxxx>
Signed-off-by: Javier Martinez Canillas <javierm@xxxxxxxxxx>
---

drivers/gpu/drm/solomon/ssd130x.c | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/drivers/gpu/drm/solomon/ssd130x.c b/drivers/gpu/drm/solomon/ssd130x.c
index b3dc1ca9dc10..afb08a8aa9fc 100644
--- a/drivers/gpu/drm/solomon/ssd130x.c
+++ b/drivers/gpu/drm/solomon/ssd130x.c
@@ -153,7 +153,7 @@ static int ssd130x_buf_alloc(struct ssd130x_device *ssd130x)
const struct drm_format_info *fi;
unsigned int pitch;
- fi = drm_format_info(DRM_FORMAT_C1);
+ fi = drm_format_info(DRM_FORMAT_R1);
if (!fi)
return -EINVAL;

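Just as a side note: since both fourccs describe one bit per pixel, the
minimum pitch (and therefore the allocated buffer size) comes out
identical for C1 and R1, which is why there is no functional change.
Below is a minimal standalone sketch of that arithmetic, not driver
code; the 128x64 panel size is only an illustrative assumption:

/* Standalone illustration (not driver code): for a 1 bpp format the
 * minimum pitch is ceil(width / 8) bytes, regardless of whether the
 * fourcc is C1 (color-indexed) or R1 (single-channel).
 */
#include <stdio.h>

static unsigned int min_pitch_1bpp(unsigned int width)
{
	return (width + 7) / 8; /* DIV_ROUND_UP(width, 8) */
}

int main(void)
{
	unsigned int width = 128, height = 64; /* example panel size only */
	unsigned int pitch = min_pitch_1bpp(width);

	printf("pitch = %u bytes, buffer = %u bytes\n",
	       pitch, pitch * height);
	return 0;
}
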
--
Thomas Zimmermann
Graphics Driver Developer
SUSE Software Solutions Germany GmbH
Frankenstrasse 146, 90461 Nuernberg, Germany
GF: Ivo Totev, Andrew Myers, Andrew McDonald, Boudien Moerman
HRB 36809 (AG Nuernberg)
