core/java/android/hardware/DisplayLuts.java (+16 −6)

--- a/core/java/android/hardware/DisplayLuts.java
+++ b/core/java/android/hardware/DisplayLuts.java
@@ -64,13 +64,23 @@ public final class DisplayLuts {
     /**
      * Create a Lut entry.
      *
-     * <p>
-     * 1D Lut(s) are treated as gain curves.
-     * 3D Lut(s) are used for direct color manipulations.
-     * The values of 3D Lut(s) data should be normalized to the range {@code 0.0}
+     * <p> 1D Lut(s) are treated as gain curves. </p>
+     * <p> 3D Lut(s) are used for direct color manipulations. </p>
+     * <p> For 3D Lut(s), the values should be normalized to the range {@code 0.0}
      * to {@code 1.0}, inclusive. And {@code 1.0} is the maximum panel luminance.
-     * And 3D Lut(s) data is organized in RGB order
-     * (R0, R1, R2, ..., RN, G0, G1, G2, ..., GN, B0, B1, B2, ..., BN) if N is the dimension.
+     * And If N is the size of each dimension, the data is arranged in RGB order:
+     * <pre>
+     * R(0, 0, 0), R(0, 0, 1), ..., R(0, 0, N - 1),
+     * R(0, 1, 0), ..., R(0, 1, N - 1), ..., R(0, N - 1, N - 1),
+     * R(1, 0, 0), ..., R(1, 0, N - 1), ..., R(1, N - 1, N - 1), ..., R(N - 1, N - 1, N - 1),
+     * G(0, 0, 0), ..., G(N - 1, N - 1, N - 1),
+     * B(0, 0, 0), ..., B(N - 1, N - 1, N - 1)</pre>
+     * When a GPU shader samples 3D Lut data, it's accessed in a flat,
+     * one-dimensional arrangement. Assuming that we have a 3D array
+     * {@code ORIGINAL[N][N][N]}, then
+     * <pre>
+     * FLAT[z + N * (y + N * x)] = ORIGINAL[x][y][z]</pre>
+     * </p>
      *
      * @param buffer The raw lut data
      * @param dimension Either 1D or 3D
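The new Javadoc defines two things: a planar RGB ordering (all R samples, then all G, then all B) and a flat index `FLAT[z + N * (y + N * x)] = ORIGINAL[x][y][z]` within each plane. A minimal sketch of that packing, assuming separate `float[N][N][N]` arrays per channel (the class name `LutPackingSketch` and the `flatten` helper are hypothetical, not part of the framework API):

```java
// Hypothetical sketch of the buffer layout described in the Javadoc: three
// N*N*N color planes (R, then G, then B), each flattened as
// FLAT[z + N * (y + N * x)] = ORIGINAL[x][y][z].
public final class LutPackingSketch {
    static float[] flatten(float[][][] r, float[][][] g, float[][][] b, int n) {
        float[] flat = new float[3 * n * n * n];
        int plane = n * n * n; // samples per color plane
        for (int x = 0; x < n; x++) {
            for (int y = 0; y < n; y++) {
                for (int z = 0; z < n; z++) {
                    int i = z + n * (y + n * x); // flat index within one plane
                    flat[i]             = r[x][y][z]; // R plane comes first
                    flat[plane + i]     = g[x][y][z]; // then all G samples
                    flat[2 * plane + i] = b[x][y][z]; // then all B samples
                }
            }
        }
        return flat;
    }
}
```

For example, with N = 2 the sample R(1, 0, 1) lands at flat index 1 + 2 * (0 + 2 * 1) = 5, and the G plane starts at offset 8, which is the arrangement a shader would expect when treating the buffer as one-dimensional.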