
Java Picture Class Code Examples


This article collects typical usage examples of the Java class org.jcodec.common.model.Picture. If you are wondering what the Picture class does, how to use it, or what real code that uses it looks like, the curated examples below should help.



The Picture class belongs to the org.jcodec.common.model package. Twenty code examples of the class are shown below, sorted by popularity by default.
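Before the examples, here is a minimal sketch of the Picture API as it is used throughout this article (Picture.create, getWidth/getHeight, getPlaneData); the frame size and color space below are placeholders chosen for illustration only:

import org.jcodec.common.model.ColorSpace;
import org.jcodec.common.model.Picture;

public class PictureBasics {
    public static void main(String[] args) {
        // Allocate an empty 640x480 frame in planar YUV 4:2:0, as the examples below do
        Picture pic = Picture.create(640, 480, ColorSpace.YUV420);

        // Plane 0 holds the luma samples; planes 1 and 2 hold the subsampled chroma
        int[] luma = pic.getPlaneData(0);
        System.out.println(pic.getWidth() + "x" + pic.getHeight()
                + ", luma samples: " + luma.length);
    }
}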

Example 1: encodeNativeFrame

import org.jcodec.common.model.Picture; // import the dependency package/class
public void encodeNativeFrame(Picture pic) throws IOException {
    if (toEncode == null) {
        toEncode = Picture.create(pic.getWidth(), pic.getHeight(), encoder.getSupportedColorSpaces()[0]);
    }

    // Perform conversion
    try {
        transform.transform(pic, toEncode);
    } catch (Exception e) {
        // conversion failed; skip this frame rather than aborting the whole encode
        return;
    }
    // Encode image into H.264 frame, the result is stored in '_out' buffer
    _out.clear();
    ByteBuffer result = encoder.encodeFrame(toEncode, _out);

    // Based on the frame above form correct MP4 packet
    spsList.clear();
    ppsList.clear();
    H264Utils.wipePS(result, spsList, ppsList);
    H264Utils.encodeMOVPacket(result);

    // Add packet to video track
    outTrack.addFrame(new MP4Packet(result, frameNo, 5, 1, frameNo, true, null, frameNo, 0));

    frameNo++;
}
 
Developer: hiliving, Project: P2Video-master, Lines of code: 27, Source file: SequenceEncoderMp4.java
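For context, a hedged sketch of how a method like encodeNativeFrame is typically driven: each BufferedImage is converted to a Picture with jcodec's AWTUtil.fromBufferedImage (the same helper appears in Example 7) and handed to the encoder. The encoder instance and the frame source are assumptions for illustration only:

// Hypothetical driver loop; assumes `encoder` is an instance of the
// SequenceEncoderMp4 class above and `frames` is an Iterable<BufferedImage>.
for (BufferedImage frame : frames) {
    Picture pic = AWTUtil.fromBufferedImage(frame); // RGB Picture wrapping the image data
    encoder.encodeNativeFrame(pic);                 // color-converted and encoded as one H.264 frame
}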


Example 2: transcode

import org.jcodec.common.model.Picture; // import the dependency package/class
public List<ByteBuffer> transcode() throws IOException {
    H264Decoder decoder = new H264Decoder();
    decoder.addSps(avcC.getSpsList());
    decoder.addPps(avcC.getPpsList());
    Picture buf = Picture.create(mbW << 4, mbH << 4, ColorSpace.YUV420);
    Frame dec = null;
    for (VirtualPacket virtualPacket : head) {
        dec = decoder.decodeFrame(H264Utils.splitMOVPacket(virtualPacket.getData(), avcC), buf.getData());
    }
    H264Encoder encoder = new H264Encoder(rc);
    ByteBuffer tmp = ByteBuffer.allocate(frameSize);

    List<ByteBuffer> result = new ArrayList<ByteBuffer>();
    for (VirtualPacket pkt : tail) {
        dec = decoder.decodeFrame(H264Utils.splitMOVPacket(pkt.getData(), avcC), buf.getData());

        tmp.clear();
        ByteBuffer res = encoder.encodeFrame(dec, tmp);
        ByteBuffer out = ByteBuffer.allocate(frameSize);
        processFrame(res, out);

        result.add(out);
    }

    return result;
}
 
Developer: PenoaksDev, Project: OpenSpaceDVR, Lines of code: 27, Source file: AVCClipTrack.java


Example 3: encodeNativeFrame

import org.jcodec.common.model.Picture; // import the dependency package/class
public void encodeNativeFrame(Picture pic) throws IOException {
    if (toEncode == null) {
        toEncode = Picture.create(pic.getWidth(), pic.getHeight(), encoder.getSupportedColorSpaces()[0]);
    }

    // Perform conversion
    transform.transform(pic, toEncode);

    // Encode image into H.264 frame, the result is stored in '_out' buffer
    _out.clear();
    ByteBuffer result = encoder.encodeFrame(toEncode, _out);

    // Based on the frame above form correct MP4 packet
    spsList.clear();
    ppsList.clear();
    H264Utils.wipePS(result, spsList, ppsList);
    H264Utils.encodeMOVPacket(result);

    // Add packet to video track
    outTrack.addFrame(new MP4Packet(result, frameNo, timeScale, 1, frameNo, true, null, frameNo, 0));

    frameNo++;
}
 
Developer: ynztlxdeai, Project: ImageToVideo, Lines of code: 24, Source file: SequenceEncoderMp4.java


Example 4: toColorArray

import org.jcodec.common.model.Picture; // import the dependency package/class
public static int[] toColorArray(Picture src) {
    if (src.getColor() != ColorSpace.RGB) {
        Transform transform = ColorUtil.getTransform(src.getColor(), ColorSpace.RGB);
        Picture rgb = Picture.create(src.getWidth(), src.getHeight(), ColorSpace.RGB, src.getCrop());
        transform.transform(src, rgb);
        src = rgb;
    }

    int[] _return = new int[src.getCroppedWidth() * src.getCroppedHeight()];

    int[] data = src.getPlaneData(0);

    for (int i = 0; i < _return.length; ++i) {
        _return[i] = ReadableRGBContainer.toIntColor(data[3 * i + 2], data[3 * i + 1], data[3 * i]);
    }

    return _return;
}
 
Developer: vitrivr, Project: cineast, Lines of code: 19, Source file: PictureUtil.java
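A short usage sketch for the helper above, assuming `frame` is a Picture decoded elsewhere (any color space is accepted, since the method converts to RGB internally); the packed layout of each int is defined by ReadableRGBContainer.toIntColor:

// Hypothetical usage; `frame` is any decoded org.jcodec Picture.
int[] colors = PictureUtil.toColorArray(frame);
int w = frame.getCroppedWidth();
int h = frame.getCroppedHeight();
int centerPixel = colors[(h / 2) * w + (w / 2)]; // row-major indexing over the cropped area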


Example 5: open

import org.jcodec.common.model.Picture; // import the dependency package/class
@Override
public void open(String _path, int width, int _height, int _fps) throws IOException {
    path = _path;
    height = _height;
    fps = _fps;

    ch = new FileChannelWrapper(FileChannel.open(Paths.get(path), StandardOpenOption.CREATE, StandardOpenOption.WRITE, StandardOpenOption.TRUNCATE_EXISTING));
    // Muxer that will store the encoded frames
    muxer = new MP4Muxer(ch, Brand.MP4);
    // Add video track to muxer
    outTrack = muxer.addTrack(TrackType.VIDEO, fps);
    // Allocate a buffer big enough to hold output frames
    _out = ByteBuffer.allocateDirect(width * height * 6);
    // Create an instance of encoder
    encoder = new H264Encoder(new JCodecUtils.JHVRateControl(20));
    // Encoder extra data ( SPS, PPS ) to be stored in a special place of MP4
    spsList = new ArrayList<>();
    ppsList = new ArrayList<>();
    toEncode = Picture.create(width, height, ColorSpace.YUV420J);
}
 
Developer: Helioviewer-Project, Project: JHelioviewer-SWHV, Lines of code: 21, Source file: JCodecExporter.java


Example 6: makeFrame

import org.jcodec.common.model.Picture; // import the dependency package/class
private Picture makeFrame(BufferedImage bi) {
    DataBuffer imageData = bi.getRaster().getDataBuffer();
    int[] yPixel = new int[imageData.getSize()];
    int[] uPixel = new int[imageData.getSize() >> 2];
    int[] vPixel = new int[imageData.getSize() >> 2];
    int ipx = 0, uvOffset = 0;

    for (int h = 0; h < bi.getHeight(); h++) {
        for (int w = 0; w < bi.getWidth(); w++) {
            int elem = imageData.getElem(ipx);
            int r = 0x0ff & (elem >>> 16);
            int g = 0x0ff & (elem >>> 8);
            int b = 0x0ff & elem;
            yPixel[ipx] = ((66 * r + 129 * g + 25 * b) >> 8) + 16;
            if ((0 != w % 2) && (0 != h % 2)) {
                uPixel[uvOffset] = ((-38 * r + -74 * g + 112 * b) >> 8) + 128;
                vPixel[uvOffset] = ((112 * r + -94 * g + -18 * b) >> 8) + 128;
                uvOffset++;
            }
            ipx++;
        }
    }
    int[][] pix = {yPixel, uPixel, vPixel, null};
    return new Picture(bi.getWidth(), bi.getHeight(), pix, ColorSpace.YUV420);
}
 
Developer: kamil-karkus, Project: EasySnap, Lines of code: 26, Source file: Encoder.java
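The integer expressions above are a fixed-point (>> 8) approximation of the BT.601 studio-swing RGB-to-YCbCr conversion, with chroma sampled once per 2x2 block. A standalone per-pixel helper with the same coefficients, written out only to make the arithmetic easier to follow:

// Roughly Y = 0.257R + 0.504G + 0.098B + 16, with Cb and Cr offset by 128 (BT.601 approximation).
static int[] rgbToYCbCr(int r, int g, int b) {
    int y  = (( 66 * r + 129 * g +  25 * b) >> 8) + 16;
    int cb = ((-38 * r -  74 * g + 112 * b) >> 8) + 128;
    int cr = ((112 * r -  94 * g -  18 * b) >> 8) + 128;
    return new int[] { y, cb, cr };
}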


Example 7: encodeImage

import org.jcodec.common.model.Picture; // import the dependency package/class
public void encodeImage(BufferedImage bi) throws IOException {
    if (toEncode == null) {
        toEncode = Picture.create(bi.getWidth(), bi.getHeight(), ColorSpace.YUV420);
    }

    // Perform conversion
    for (int i = 0; i < 3; i++) {
        Arrays.fill(toEncode.getData()[i], 0);
    }
    transform.transform(AWTUtil.fromBufferedImage(bi), toEncode);

    // Encode image into H.264 frame, the result is stored in '_out' buffer
    _out.clear();
    ByteBuffer result = encoder.encodeFrame(_out, toEncode);

    // Based on the frame above form correct MP4 packet
    spsList.clear();
    ppsList.clear();
    H264Utils.encodeMOVPacket(result, spsList, ppsList);

    // Add packet to video track
    outTrack.addFrame(new MP4Packet(result, frameNo, 25, 1, frameNo, true, null, frameNo, 0));

    frameNo++;
}
 
Developer: deepakpk009, Project: JScreenRecorder, Lines of code: 26, Source file: SequenceEncoder.java
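Since this snippet comes from a screen recorder, a plausible way to feed it is AWT's Robot: capture the screen as a BufferedImage and pass each capture to encodeImage. The recorder instance and frame count below are hypothetical:

// Hypothetical capture loop; `recorder` is an instance of the SequenceEncoder shown above.
// Requires java.awt.Robot, java.awt.Rectangle, java.awt.Toolkit and java.awt.image.BufferedImage.
static void recordScreen(SequenceEncoder recorder, int frameCount) throws Exception {
    Robot robot = new Robot();
    Rectangle screen = new Rectangle(Toolkit.getDefaultToolkit().getScreenSize());
    for (int i = 0; i < frameCount; i++) {
        BufferedImage shot = robot.createScreenCapture(screen);
        recorder.encodeImage(shot); // converted to YUV420 and muxed as one frame
    }
}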


Example 8: transcode

import org.jcodec.common.model.Picture; // import the dependency package/class
public ByteBuffer transcode(ByteBuffer in, ByteBuffer _out) {
    ByteBuffer out = _out.slice();
    int width = (sh.horizontal_size + 15) & ~0xf;
    int height = (sh.vertical_size + 15) & ~0xf;

    int[][] buffer = new int[][] { new int[width * height], new int[width * height], new int[width * height],
            new int[(width >> 4) * (height >> 4)] };
    Picture dct = decodeFrame(in, buffer);

    Picture[] pic = convert(dct);

    if (pic.length == 1)
        dct2Prores.encodeFrame(out, pic[0]);
    else
        dct2Prores.encodeFrame(out, pic[0], pic[1]);
    out.flip();
    return out;
}
 
Developer: PenoaksDev, Project: OpenSpaceDVR, Lines of code: 19, Source file: Mpeg2Prores.java


Example 9: decodeScan

import org.jcodec.common.model.Picture; // import the dependency package/class
private Picture decodeScan(ByteBuffer data, FrameHeader header, ScanHeader scan, VLC[] huffTables, int[][] quant,
        int[][] data2, int field, int step) {
    int blockW = header.getHmax();
    int blockH = header.getVmax();
    int mcuW = blockW << 3;
    int mcuH = blockH << 3;

    int width = header.width;
    int height = header.height;

    int xBlocks = (width + mcuW - 1) >> (blockW + 2);
    int yBlocks = (height + mcuH - 1) >> (blockH + 2);

    int nn = blockW + blockH;
    Picture result = new Picture(xBlocks << (blockW + 2), yBlocks << (blockH + 2), data2,
            nn == 4 ? ColorSpace.YUV420J : (nn == 3 ? ColorSpace.YUV422J : ColorSpace.YUV444J), new Rect(0, 0,
                    width, height));

    BitReader bits = new BitReader(data);
    int[] dcPredictor = new int[] { 1024, 1024, 1024 };
    for (int by = 0; by < yBlocks; by++)
        for (int bx = 0; bx < xBlocks && bits.moreData(); bx++)
            decodeMCU(bits, dcPredictor, quant, huffTables, result, bx, by, blockW, blockH, field, step);

    return result;
}
 
Developer: PenoaksDev, Project: OpenSpaceDVR, Lines of code: 27, Source file: JpegDecoder.java


Example 10: yuv42210BitTObgr24

import org.jcodec.common.model.Picture; // import the dependency package/class
public void yuv42210BitTObgr24(Picture yuv, ByteBuffer rgb32) {
    IntBuffer y = IntBuffer.wrap(yuv.getPlaneData(0));
    IntBuffer cb = IntBuffer.wrap(yuv.getPlaneData(1));
    IntBuffer cr = IntBuffer.wrap(yuv.getPlaneData(2));

    while (y.hasRemaining()) {
        int c1 = y.get() - 64;
        int c2 = y.get() - 64;
        int d = cb.get() - 512;
        int e = cr.get() - 512;

        rgb32.put(blue(d, c1));
        rgb32.put(green(d, e, c1));
        rgb32.put(red(e, c1));

        rgb32.put(blue(d, c2));
        rgb32.put(green(d, e, c2));
        rgb32.put(red(e, c2));
    }
}
 
Developer: PenoaksDev, Project: OpenSpaceDVR, Lines of code: 21, Source file: CVTColorFilter.java


Example 11: transform

import org.jcodec.common.model.Picture; // import the dependency package/class
public void transform(Picture src, Picture dst) {
    int lumaSize = src.getWidth() * src.getHeight();
    System.arraycopy(src.getPlaneData(0), 0, dst.getPlaneData(0), 0, lumaSize);
    copyAvg(src.getPlaneData(1), dst.getPlaneData(1), src.getPlaneWidth(1), src.getPlaneHeight(1));
    copyAvg(src.getPlaneData(2), dst.getPlaneData(2), src.getPlaneWidth(2), src.getPlaneHeight(2));

    if (shiftUp > shiftDown) {
        up(dst.getPlaneData(0), shiftUp - shiftDown);
        up(dst.getPlaneData(1), shiftUp - shiftDown);
        up(dst.getPlaneData(2), shiftUp - shiftDown);
    } else if (shiftDown > shiftUp) {
        down(dst.getPlaneData(0), shiftDown - shiftUp);
        down(dst.getPlaneData(1), shiftDown - shiftUp);
        down(dst.getPlaneData(2), shiftDown - shiftUp);
    }
}
 
Developer: PenoaksDev, Project: OpenSpaceDVR, Lines of code: 17, Source file: Yuv444pToYuv420p.java


Example 12: test

import org.jcodec.common.model.Picture; // import the dependency package/class
private boolean test(File coded, File ref) throws IOException {
    MappedH264ES es = new MappedH264ES(NIOUtils.fetchFrom(coded));
    Picture buf = Picture.create(1920, 1088, ColorSpace.YUV420);
    H264Decoder dec = new H264Decoder();
    Packet nextFrame;
    ByteBuffer _yuv = NIOUtils.fetchFrom(ref);
    while ((nextFrame = es.nextFrame()) != null) {
        Picture out = dec.decodeFrame(nextFrame.getData(), buf.getData()).cropped();
        Picture pic = out.createCompatible();
        pic.copyFrom(out);
        int lumaSize = pic.getWidth() * pic.getHeight();
        int crSize = lumaSize >> 2;
        int cbSize = lumaSize >> 2;

        ByteBuffer yuv = NIOUtils.read(_yuv, lumaSize + crSize + cbSize);

        if (!Arrays.equals(getAsIntArray(yuv, lumaSize), pic.getPlaneData(0)))
            return false;
        if (!Arrays.equals(getAsIntArray(yuv, crSize), pic.getPlaneData(1)))
            return false;
        if (!Arrays.equals(getAsIntArray(yuv, cbSize), pic.getPlaneData(2)))
            return false;
    }
    return true;
}
 
Developer: PenoaksDev, Project: OpenSpaceDVR, Lines of code: 26, Source file: VerifyTool.java


Example 13: put

import org.jcodec.common.model.Picture; // import the dependency package/class
public void put(Picture tgt, Picture decoded, int mbX, int mbY) {
    int[] luma = tgt.getPlaneData(0);
    int stride = tgt.getPlaneWidth(0);

    int[] cb = tgt.getPlaneData(1);
    int[] cr = tgt.getPlaneData(2);
    int strideChroma = tgt.getPlaneWidth(1);

    int dOff = 0;
    for (int i = 0; i < 16; i++) {
        System.arraycopy(decoded.getPlaneData(0), dOff, luma, (mbY * 16 + i) * stride + mbX * 16, 16);
        dOff += 16;
    }
    for (int i = 0; i < 8; i++) {
        System.arraycopy(decoded.getPlaneData(1), i * 8, cb, (mbY * 8 + i) * strideChroma + mbX * 8, 8);
    }
    for (int i = 0; i < 8; i++) {
        System.arraycopy(decoded.getPlaneData(2), i * 8, cr, (mbY * 8 + i) * strideChroma + mbX * 8, 8);
    }
}
 
Developer: PenoaksDev, Project: OpenSpaceDVR, Lines of code: 22, Source file: SliceDecoder.java


Example 14: runJMCompareResults

import org.jcodec.common.model.Picture; // import the dependency package/class
private void runJMCompareResults(List<Picture> decodedPics, int seqNo) throws Exception {
    try {
        Process process = Runtime.getRuntime().exec(jm + " -d " + jmconf.getAbsolutePath());
        process.waitFor();

        ByteBuffer yuv = NIOUtils.fetchFrom(decoded);
        for (Picture pic : decodedPics) {
            pic = pic.cropped();
            boolean equals = Arrays.equals(getAsIntArray(yuv, pic.getPlaneWidth(0) * pic.getPlaneHeight(0)),
                    pic.getPlaneData(0));
            equals &= Arrays.equals(getAsIntArray(yuv, pic.getPlaneWidth(1) * pic.getPlaneHeight(1)),
                    pic.getPlaneData(1));
            equals &= Arrays.equals(getAsIntArray(yuv, pic.getPlaneWidth(2) * pic.getPlaneHeight(2)),
                    pic.getPlaneData(2));
            if (!equals)
                diff(seqNo);
        }
    } catch (Exception e) {
        diff(seqNo);
    }
}
 
Developer: PenoaksDev, Project: OpenSpaceDVR, Lines of code: 23, Source file: TestTool.java


Example 15: transform

import org.jcodec.common.model.Picture; // import the dependency package/class
public void transform(Picture src, Picture dst) {
    int[] y = src.getPlaneData(0);
    int[] u = src.getPlaneData(1);
    int[] v = src.getPlaneData(2);

    int[] data = dst.getPlaneData(0);

    int offLuma = 0, offChroma = 0;
    for (int i = 0; i < dst.getHeight(); i++) {
        for (int j = 0; j < dst.getWidth(); j += 2) {
            YUVJtoRGB(y[offLuma], u[offChroma], v[offChroma], data, offLuma * 3);
            YUVJtoRGB(y[offLuma + 1], u[offChroma], v[offChroma], data, (offLuma + 1) * 3);
            offLuma += 2;
            ++offChroma;
        }
    }

}
 
Developer: PenoaksDev, Project: OpenSpaceDVR, Lines of code: 19, Source file: Yuv422jToRgb.java


Example 16: toBufferedImage

import org.jcodec.common.model.Picture; // import the dependency package/class
public static BufferedImage toBufferedImage(Picture src) {
    if (src.getColor() != ColorSpace.RGB) {
        Transform transform = ColorUtil.getTransform(src.getColor(), ColorSpace.RGB);
        Picture rgb = Picture.create(src.getWidth(), src.getHeight(), ColorSpace.RGB, src.getCrop());
        transform.transform(src, rgb);
        src = rgb;
    }

    BufferedImage dst = new BufferedImage(src.getCroppedWidth(), src.getCroppedHeight(),
            BufferedImage.TYPE_3BYTE_BGR);

    if (src.getCrop() == null)
        toBufferedImage(src, dst);
    else
        toBufferedImageCropped(src, dst);

    return dst;
}
 
Developer: PenoaksDev, Project: OpenSpaceDVR, Lines of code: 19, Source file: AWTUtil.java
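A brief usage sketch: given any decoded Picture (for instance the cropped decoder output from Example 12), the conversion above pairs naturally with javax.imageio for writing still images; the variable and file name below are placeholders:

// Hypothetical usage; `decodedPicture` is any org.jcodec Picture.
// Requires javax.imageio.ImageIO and java.io.File.
BufferedImage img = AWTUtil.toBufferedImage(decodedPicture);
ImageIO.write(img, "png", new File("frame-0001.png"));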


Example 17: decodeMBlockIPCM

import org.jcodec.common.model.Picture; // import the dependency package/class
public void decodeMBlockIPCM(BitReader reader, int mbIndex, Picture mb) {
    int mbX = mapper.getMbX(mbIndex);

    reader.align();

    int[] samplesLuma = new int[256];
    for (int i = 0; i < 256; i++) {
        samplesLuma[i] = reader.readNBit(8);
    }
    int MbWidthC = 16 >> chromaFormat.compWidth[1];
    int MbHeightC = 16 >> chromaFormat.compHeight[1];

    int[] samplesChroma = new int[2 * MbWidthC * MbHeightC];
    for (int i = 0; i < 2 * MbWidthC * MbHeightC; i++) {
        samplesChroma[i] = reader.readNBit(8);
    }
    collectPredictors(mb, mbX);
}
 
Developer: PenoaksDev, Project: OpenSpaceDVR, Lines of code: 19, Source file: SliceDecoder.java


Example 18: transform

import org.jcodec.common.model.Picture; // import the dependency package/class
public void transform(Picture src, Picture dst) {
    int[] y = src.getPlaneData(0);
    int[] u = src.getPlaneData(1);
    int[] v = src.getPlaneData(2);

    int[] data = dst.getPlaneData(0);

    int offLuma = 0, offChroma = 0;
    for (int i = 0; i < dst.getHeight(); i++) {
        for (int j = 0; j < dst.getWidth(); j += 2) {
            YUV444toRGB888((y[offLuma] << upShift) >> downShift, (u[offChroma] << upShift) >> downShift,
                    (v[offChroma] << upShift) >> downShift, data, offLuma * 3);
            YUV444toRGB888((y[offLuma + 1] << upShift) >> downShift, (u[offChroma] << upShift) >> downShift,
                    (v[offChroma] << upShift) >> downShift, data, (offLuma + 1) * 3);
            offLuma += 2;
            ++offChroma;
        }
    }

}
 
Developer: PenoaksDev, Project: OpenSpaceDVR, Lines of code: 21, Source file: Yuv422pToRgb.java


Example 19: filterBlockEdgeVert

import org.jcodec.common.model.Picture; // import the dependency package/class
private void filterBlockEdgeVert(Picture pic, int comp, int x, int y, int indexAlpha, int indexBeta, int bs,
        int blkH) {

    int stride = pic.getPlaneWidth(comp);
    for (int i = 0; i < blkH; i++) {
        int offsetQ = (y + i) * stride + x;
        int p2Idx = offsetQ - 3;
        int p1Idx = offsetQ - 2;
        int p0Idx = offsetQ - 1;
        int q0Idx = offsetQ;
        int q1Idx = offsetQ + 1;
        int q2Idx = offsetQ + 2;

        if (bs == 4) {
            int p3Idx = offsetQ - 4;
            int q3Idx = offsetQ + 3;
            filterBs4(indexAlpha, indexBeta, pic.getPlaneData(comp), p3Idx, p2Idx, p1Idx, p0Idx, q0Idx, q1Idx,
                    q2Idx, q3Idx, comp != 0);
        } else if (bs > 0) {
            filterBs(bs, indexAlpha, indexBeta, pic.getPlaneData(comp), p2Idx, p1Idx, p0Idx, q0Idx, q1Idx, q2Idx,
                    comp != 0);
        }
    }
}
 
Developer: PenoaksDev, Project: OpenSpaceDVR, Lines of code: 25, Source file: DeblockingFilter.java


Example 20: encodeFrame

import org.jcodec.common.model.Picture; // import the dependency package/class
public ByteBuffer encodeFrame(Picture picture) {
    if (picture.getColor() != ColorSpace.RGB)
        throw new IllegalArgumentException("Only RGB image can be stored in PPM");
    ByteBuffer buffer = ByteBuffer.allocate(picture.getWidth() * picture.getHeight() * 3 + 200);
    buffer.put(JCodecUtil.asciiString("P6 " + picture.getWidth() + " " + picture.getHeight() + " 255\n"));

    int[][] data = picture.getData();
    for (int i = 0; i < picture.getWidth() * picture.getHeight() * 3; i += 3) {
        buffer.put((byte) data[0][i + 2]);
        buffer.put((byte) data[0][i + 1]);
        buffer.put((byte) data[0][i]);
    }

    buffer.flip();

    return buffer;
}
 
Developer: PenoaksDev, Project: OpenSpaceDVR, Lines of code: 18, Source file: PPMEncoder.java
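To persist the result, the returned ByteBuffer can be written with plain java.nio. This sketch assumes `picture` is an RGB Picture and `encoder` is the PPMEncoder above; the output path is a placeholder:

// Hypothetical usage; requires java.nio.channels.FileChannel, java.nio.file.Paths
// and java.nio.file.StandardOpenOption.
ByteBuffer ppm = encoder.encodeFrame(picture);
try (FileChannel ch = FileChannel.open(Paths.get("frame.ppm"),
        StandardOpenOption.CREATE, StandardOpenOption.WRITE, StandardOpenOption.TRUNCATE_EXISTING)) {
    ch.write(ppm);
}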



Note: The org.jcodec.common.model.Picture examples in this article were collected from source-code and documentation platforms such as GitHub and MSDocs. The snippets come from open-source projects contributed by their authors; copyright remains with the original authors, and any redistribution or use should follow the corresponding project's license. Do not republish without permission.

