doubts about setPixels() in WritableRaster

mark jason

I was trying to get familiar with the Raster and BufferedImage classes etc., and encountered some doubts along the way.

1).
In the Raster.getPixels() method, why do we have to pass an optionally predefined double array?

2).
I implemented a method which normalises the image data. However, it doesn't do what I expected. For some reason, BufferedImage.setData() doesn't give the expected result.

I have listed the output of this program below.

original image data:
23.0 32.0 13.0 55.0 65.0 36.0 46.0 64.0 27.0 43.0 71.0 58.0 38.0 25.0
62.0 47.0 19.0 72.0 37.0 55.0
image data after normalisation:
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
1.0 0.0 0.0

I can't make out why these zeroes are getting printed. It should have been

0.3194 0.4444 0.18055 0.7638 ... etc. (each pixel divided by the maximum, 72: 23/72 = 0.3194, 32/72 = 0.4444, and so on).

If anyone can help me figure out where I have messed up, please do.
Thanks,
mark

<code>

import java.awt.image.BufferedImage;
import java.awt.image.ColorConvertOp;
import java.awt.image.WritableRaster;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;

public class MyImage {

    private BufferedImage bi; // of type BufferedImage.TYPE_BYTE_GRAY
    private String fileName;

    public MyImage(String imageFileName) throws IOException {
        this.fileName = imageFileName;
        File imageFile = new File(imageFileName);
        this.bi = makeGrayScaleImage(ImageIO.read(imageFile));
    }

    public MyImage(String imageName, int width, int height, double[] data) {
        if (data.length != width * height) {
            throw new IllegalArgumentException("data size must be equal to " + width * height);
        }
        this.bi = new BufferedImage(width, height, BufferedImage.TYPE_BYTE_GRAY);
        this.bi.getRaster().setPixels(0, 0, width, height, data);
        this.fileName = imageName;
    }

    // converts an arbitrary image to TYPE_BYTE_GRAY via a color-space conversion
    private BufferedImage makeGrayScaleImage(BufferedImage img) {
        BufferedImage gray = null;
        try {
            gray = new BufferedImage(img.getWidth(), img.getHeight(),
                    BufferedImage.TYPE_BYTE_GRAY);
            ColorConvertOp ccop = new ColorConvertOp(
                    img.getColorModel().getColorSpace(),
                    gray.getColorModel().getColorSpace(), null);
            ccop.filter(img, gray);
        } catch (Exception e) {
            System.err.println("grayscale conversion failed");
        }
        return gray;
    }

    public int getWidth() {
        return this.bi.getWidth();
    }

    public int getHeight() {
        return this.bi.getHeight();
    }

    // returns a copy of the pixel samples as doubles
    public double[] getData() {
        int h = getHeight();
        int w = getWidth();
        double[] data = new double[h * w];
        double[] pixeldata = this.bi.getData().getPixels(0, 0, w, h, data);
        return pixeldata;
    }

    // scales every pixel by the maximum value and writes the result back
    public void normaliseImageData() {
        double[] ndata = getNormalisedData();
        WritableRaster wr = this.bi.getRaster();
        wr.setPixels(0, 0, this.getWidth(), this.getHeight(), ndata);
        this.bi.setData(wr);
    }

    private double[] getNormalisedData() {
        double[] d = getData();
        double maxval = max(d);
        for (int i = 0; i < d.length; i++) {
            d[i] /= maxval;
        }
        return d;
    }

    private static double max(double[] arr) {
        double m = Double.NEGATIVE_INFINITY; // MIN_VALUE is the smallest positive double
        for (int i = 0; i < arr.length; i++) {
            m = Math.max(m, arr[i]);
        }
        return m;
    }

    private static void printArray(double[] a) {
        for (double x : a) {
            System.out.print(x + " ");
        }
        System.out.println();
    }

    private static void debug(String msg) {
        System.out.println(msg);
    }

    public static void main(String[] args) {
        MyImage my = new MyImage("dummy.png", 4, 5, new double[]
                {23, 32, 13, 55, 65, 36, 46, 64, 27, 43, 71, 58, 38, 25, 62, 47, 19, 72, 37, 55});
        double[] data = my.getData();
        debug("original image data:");
        printArray(data);
        my.normaliseImageData();
        debug("image data after normalisation:");
        double[] newdata = my.getData();
        printArray(newdata);
    }
}
</code>

Mayeul

I was trying to get familiar with the Raster and BufferedImage classes etc., and encountered some doubts along the way.

1).
In the Raster.getPixels() method, why do we have to pass an optionally predefined double array?

Optional means you don't have to.

Why you can, I can only assume, is for when you want to access the image's pixels in chunks instead of all at once: would you rather getPixels() reallocate a new array on every call, or just reuse the one array that was allocated at the beginning?
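
For instance, a minimal sketch of that reuse pattern, reading one row at a time (the file name is just a placeholder):

<code>
import java.awt.image.BufferedImage;
import java.awt.image.Raster;
import java.io.File;
import javax.imageio.ImageIO;

public class RowByRow {
    public static void main(String[] args) throws Exception {
        BufferedImage image = ImageIO.read(new File("dummy.png")); // placeholder file
        Raster raster = image.getRaster();
        int w = raster.getWidth();
        // one buffer, allocated once, reused for every row
        double[] row = new double[w * raster.getNumBands()];
        for (int y = 0; y < raster.getHeight(); y++) {
            // passing 'row' stops getPixels() from allocating a fresh array each call
            raster.getPixels(0, y, w, 1, row);
            // ... process one row of samples here ...
        }
    }
}
</code>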

2).
I implemented a method which normalises the image data. However, it doesn't do what I expected. For some reason, BufferedImage.setData() doesn't give the expected result.

I have listed the output of this program below.
original image data:
23.0 32.0 13.0 55.0 65.0 36.0 46.0 64.0 27.0 43.0 71.0 58.0 38.0 25.0
62.0 47.0 19.0 72.0 37.0 55.0
image data after normalisation:
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
1.0 0.0 0.0

I can't make out why these zeroes are getting printed. It should have been

0.3194 0.4444 0.18055 0.7638 ... etc. (each pixel divided by the maximum, 72: 23/72 = 0.3194, 32/72 = 0.4444, and so on).

I'm pretty sure that would be because your image is backed by an integer DataBuffer (DataBuffer.TYPE_BYTE for a TYPE_BYTE_GRAY image), not DataBuffer.TYPE_DOUBLE. When setPixels() stores your doubles, each sample is cast to an int, so everything below 1.0 truncates to 0 and only the normalised maximum, 72/72 = 1.0, survives.

From what I tried, typical images you'd obtain from ImageIO are
(A)RGB-based images, using a DirectColorModel where each ARGB pixel is
stored as an int.
See how all your pixels are always whole values?
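
You can see the cast directly with a two-pixel experiment (a minimal sketch using values from your output):

<code>
import java.awt.image.BufferedImage;

public class TruncationDemo {
    public static void main(String[] args) {
        BufferedImage bi = new BufferedImage(2, 1, BufferedImage.TYPE_BYTE_GRAY);
        // the byte-backed raster casts each double sample to an int on the way in
        bi.getRaster().setPixels(0, 0, 2, 1, new double[] {0.3194, 1.0});
        double[] out = bi.getRaster().getPixels(0, 0, 2, 1, (double[]) null);
        System.out.println(out[0] + " " + out[1]); // prints "0.0 1.0"
    }
}
</code>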

I didn't dive into using different primitive types to store images. I expect you'll need the constructor of BufferedImage that takes a ColorModel, and you'd need to define a ComponentColorModel with one component, based on a grayscale ColorSpace.
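
Something along these lines might work: an untested sketch relying on ComponentColorModel's support for DataBuffer.TYPE_DOUBLE (note that ImageIO won't necessarily be able to encode such an image):

<code>
import java.awt.Transparency;
import java.awt.color.ColorSpace;
import java.awt.image.BufferedImage;
import java.awt.image.ComponentColorModel;
import java.awt.image.DataBuffer;
import java.awt.image.WritableRaster;

public class DoubleGray {
    public static void main(String[] args) {
        int w = 4, h = 5;
        ColorSpace grayCs = ColorSpace.getInstance(ColorSpace.CS_GRAY);
        // one gray component, no alpha, samples stored as doubles
        ComponentColorModel cm = new ComponentColorModel(
                grayCs, false, false, Transparency.OPAQUE, DataBuffer.TYPE_DOUBLE);
        WritableRaster wr = cm.createCompatibleWritableRaster(w, h);
        BufferedImage img = new BufferedImage(cm, wr, false, null);

        // fractional samples survive the round trip instead of truncating to 0
        img.getRaster().setPixels(0, 0, 2, 1, new double[] {0.3194, 0.4444});
        double[] back = img.getRaster().getPixels(0, 0, 2, 1, (double[]) null);
        System.out.println(back[0] + " " + back[1]); // prints "0.3194 0.4444"
    }
}
</code>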
 
