Experiments with ruby-processing (processing-2.2.1) and JRubyArt for processing-3.0
Saturday, 31 January 2015
Installing netbeans ruby-plugin and jruby_art gem
Please note that for the installation of gems, and some other operations, netbeans can be a bit slow:-
Downloading and unzipping the plugin yields ~/arch/build/updates/*.*; note this location, since you will need to navigate to it when installing "downloaded" plugins (from the netbeans menu). Select all the *.nbm modules and the jruby.jar, and install (see below):-
Then install the downloaded gem as a local install (a local install is required to load a --pre gem) and you are good to go: a real ide to develop jruby_art, who needs a processing mode?
See crude Templates here
To install the templates, navigate to .netbeans/config/Templates (create a Templates folder if it does not exist), create a sub-folder 'Ruby' and put the outer .nbattrs file there, then create a folder 'jruby_art' and place the other .nbattrs file in it along with both ruby sketch files. The templates should now show up in a Ruby sub-folder.
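If you prefer to script that step, here is a minimal sketch in plain ruby; the source file names and the ~/.netbeans location are assumptions, so adjust them for your own setup:

require 'fileutils'

# build ~/.netbeans/config/Templates/Ruby/jruby_art and copy the template files in
ruby_templates = File.join(Dir.home, '.netbeans', 'config', 'Templates', 'Ruby')
FileUtils.mkdir_p(File.join(ruby_templates, 'jruby_art'))
FileUtils.cp('.nbattrs', ruby_templates) # the outer .nbattrs file
FileUtils.cp(Dir['jruby_art/*'], File.join(ruby_templates, 'jruby_art')) # inner .nbattrs plus both sketch files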
Labels:
jruby_art,
netbeans,
ruby plugin
Friday, 30 January 2015
JRubyArt vs ruby-processing
IMPORTANT: THIS DOCUMENT IS OUTDATED, BUT BY NOW YOU SHOULD ALREADY BE USING JRubyArt or propane
External Dependencies
- ruby-processing requires an installed version of vanilla-processing
- jruby_art includes processing's core jars in the gem
Runnable from netbeans
- ruby-processing sketches would need a Rakefile or some other script to run (see the sketch after this list)
- jruby_art sketches mainly just run (except for bare sketches), which opens the possibility of an incredibly simple/minimal installation from netbeans (no need for MRI ruby or an installed jruby, which might suit windows users)
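By way of illustration of the first point above, a hypothetical Rakefile that would let an ide task launch a ruby-processing sketch (my_sketch.rb is a placeholder name):

# hypothetical Rakefile: run the sketch via the ruby-processing command-line tool
desc 'run sketch'
task :default do
  sh 'rp5 run my_sketch.rb'
end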
Core processing libraries (sound, video etc)
- ruby-processing has built-in support
- jruby_art now has built-in support (but the path to the libraries is configurable); libraries can also be modular via a gem, see pbox2d as an exemplar
Contributed processing libraries
- ruby-processing has built-in support for libraries installed by the processing ide
- jruby_art does not assume that vanilla processing is installed; wrapping libraries in a separate gem would be ideal, see pbox2d and toxicgem
Vec2D, Vec3D, Arcball and DEGLUT
- ruby-processing requires load_library
- jruby_art loads them into the jruby runtime automatically (see the bare sketch below)
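To illustrate the jruby_art behaviour, a minimal bare sketch using Vec2D without any load_library call (this is only a sketch of the idea, not taken from the examples):

# Vec2D is available as soon as the sketch runs, no load_library needed
attr_reader :pos, :vel

def setup
  size 200, 200
  @pos = Vec2D.new(width / 2, height / 2)
  @vel = Vec2D.new(1.5, 0.5)
end

def draw
  background 0
  @pos += vel # Vec2D supports arithmetic operators
  ellipse(pos.x, pos.y, 10, 10)
end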
Processing::Proxy module
- in ruby-processing, Processing::Proxy can be used to mimic processing's inner classes (see the sketch below), probably very bad juju: convenience isn't everything
- in jruby_art a different Proxy is used to access some processing methods, e.g. "pre", "post" and "draw", via reflection
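As a rough illustration of the ruby-processing idiom described above, assuming the Processing::Proxy mixin exposes the sketch methods to a plain class (the Blob class and its methods are purely hypothetical):

# hypothetical helper class; fill and ellipse are resolved through the mixin
class Blob
  include Processing::Proxy

  def initialize(x, y)
    @x, @y = x, y
  end

  def display
    fill(200, 100, 100)
    ellipse(@x, @y, 20, 20)
  end
end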
Labels:
jruby_art,
libraries,
netbeans,
ruby-processing
Thursday, 29 January 2015
Netbeans, the perfect ide for jruby_art
The perfect ide for jruby_art is netbeans, because all sketches just run with the jruby command (since the netbeans ruby plugin uses a built-in jruby-complete), see below (actually bare sketches need the k9 command). It is probably easier at this point to download jruby_art-0.2.0.pre.gem and do a local install. Thus it is quite possible to develop jruby_art without a jruby/ruby install, just using the jruby provided by the netbeans ruby plugin (this might actually make sense on windows). However you won't get the examples, so make sure you get them here.
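For context, this is roughly what a bare sketch looks like, the kind that needs the k9 command rather than plain jruby (the wrapping behaviour described in the comment is an assumption on my part):

# a bare sketch: no class wrapper, no requires; the k9 command is assumed to
# wrap something like this in a Processing::App style class at run time
def setup
  size 300, 300
end

def draw
  background 0
  ellipse(mouse_x, mouse_y, 20, 20)
end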
Monday, 26 January 2015
Cranky keyboard (unless you are a northern European) sound library example
# This example shows how to make a simple sampler and sequencer with the Sound
# library. In this sketch 5 different short samples are loaded and played back
# at different pitches, in this case in 5 different octaves. The sequencer
# triggers an event every 200-1000 mSecs randomly. Each time a sound is
# played a colored rect with a random color is displayed.
load_library :sound
include_package 'processing.sound'
# Define the number of samples
NUM_SOUNDS = 5
attr_reader :device, :file, :value

def setup
  size(640, 360)
  background(255)
  # Create a Sound renderer and an array of empty soundfiles
  @device = AudioDevice.new(self, 48_000, 32)
  @file = []
  @value = Array.new(3, 0)
  # Load 5 soundfiles from a folder in a for loop. By naming the files 1., 2.,
  # 3., n.aif it is easy to iterate through the folder and load all files in
  # one line of code.
  NUM_SOUNDS.times do |i|
    file << SoundFile.new(self, format('%d.aif', (i + 1)))
  end
end

def draw
  background(*value) # splat array values
end

def key_pressed
  defined = true
  case key
  when 'a' then file[0].play(0.5, 1.0)
  when 's' then file[1].play(0.5, 1.0)
  when 'd' then file[2].play(0.5, 1.0)
  when 'f' then file[3].play(0.5, 1.0)
  when 'g' then file[4].play(0.5, 1.0)
  when 'h' then file[0].play(1.0, 1.0)
  when 'j' then file[1].play(1.0, 1.0)
  when 'k' then file[2].play(1.0, 1.0)
  when 'l' then file[3].play(1.0, 1.0)
  when 'ö' then file[4].play(1.0, 1.0)
  when 'ä' then file[0].play(2.0, 1.0)
  when 'q' then file[1].play(2.0, 1.0)
  when 'w' then file[2].play(2.0, 1.0)
  when 'e' then file[3].play(2.0, 1.0)
  when 'r' then file[4].play(2.0, 1.0)
  when 't' then file[0].play(3.0, 1.0)
  when 'z' then file[1].play(3.0, 1.0)
  when 'u' then file[2].play(3.0, 1.0)
  when 'i' then file[3].play(3.0, 1.0)
  when 'o' then file[4].play(3.0, 1.0)
  when 'p' then file[0].play(4.0, 1.0)
  when 'ü' then file[1].play(4.0, 1.0)
  else
    defined = false # only set background color value, if key is defined
  end
  @value = [rand(0..255), rand(0..255), rand(0..255)] if defined
end
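For a keyboard without umlaut keys, a possible (untested) alternative is to drive the sampler from a lookup table, with ';', "'" and '[' standing in for 'ö', 'ä' and 'ü'; this drop-in replacement for key_pressed assumes the same file array as the sketch above:

# hypothetical key lookup: [sample index, playback rate] per key
KEY_MAP = {
  'a' => [0, 0.5], 's' => [1, 0.5], 'd' => [2, 0.5], 'f' => [3, 0.5], 'g' => [4, 0.5],
  'h' => [0, 1.0], 'j' => [1, 1.0], 'k' => [2, 1.0], 'l' => [3, 1.0], ';' => [4, 1.0],
  "'" => [0, 2.0], 'q' => [1, 2.0], 'w' => [2, 2.0], 'e' => [3, 2.0], 'r' => [4, 2.0],
  't' => [0, 3.0], 'z' => [1, 3.0], 'u' => [2, 3.0], 'i' => [3, 3.0], 'o' => [4, 3.0],
  'p' => [0, 4.0], '[' => [1, 4.0]
}.freeze

def key_pressed
  index, rate = KEY_MAP[key]
  return unless index # leave the background colour alone for unmapped keys
  file[index].play(rate, 1.0)
  @value = [rand(0..255), rand(0..255), rand(0..255)]
end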
Labels:
keyboard,
library,
ruby-processing,
sound,
umlaut
Sunday, 25 January 2015
Another sound library example translated to ruby-processing
The original sketch was a blank screen; here is a lame attempt to make it a bit more interesting. Note the use of ruby-processing's map1d convenience method in favour of vanilla processing's map convenience method (map is used for a different function in ruby, and python, and a host of other languages for that matter). This is one of those sketches that needs to use jruby-complete (some permission thing according to headius).
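For anyone unfamiliar with it, map1d takes a value, an input range and an output range, leaving ruby's own Enumerable#map alone; inside a sketch it behaves like this (values chosen purely for illustration):

# re-scale 320 from the range 0..640 into the range 0.25..4.0
rate = map1d(320, (0..640), (0.25..4.0)) # => 2.125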
#
# This is a sound file player.
# NB: requires jruby-complete to run
# either --nojruby flag or use config
#
load_library :sound
include_package 'processing.sound'
attr_reader :sound_file

def setup
  size 640, 360
  background 255
  no_stroke
  # Load a soundfile
  @sound_file = SoundFile.new(self, 'vibraphon.aiff')
  report_settings
  # Play the file in a loop
  sound_file.loop
end

def draw
  red = map1d(mouse_x, (0..width), (30..255))
  green = map1d(mouse_y, (height..0), (30..255))
  fill(red, green, 0, 100)
  ellipse(mouse_x, mouse_y, 10, 10)
  manipulate_sound
end

def manipulate_sound
  # Map mouse_x from 0.25 to 4.0 for playback rate. 1 equals original playback
  # speed 2 is an octave up 0.5 is an octave down.
  sound_file.rate(map1d(mouse_x, (0..width), (0.25..4.0)))
  # Map mouse_y from 0.2 to 1.0 for amplitude
  sound_file.amp(map1d(mouse_y, (0..width), (0.2..1.0)))
  # Map mouse_y from -1.0 to 1.0 for left to right
  sound_file.pan(map1d(mouse_y, (0..height), (-1.0..1.0)))
end

def report_settings
  # These methods return useful infos about the file
  p format('SFSampleRate= %d Hz', sound_file.sample_rate)
  p format('SFSamples= %d samples', sound_file.frames)
  p format('SFDuration= %d seconds', sound_file.duration)
end
Labels:
library,
ruby-processing,
sound
Testing the new processing sound library in ruby-processing
There is a new processing sound library for processing-3.0 (that also works with processing-2.0); I thought I would give it a run-out in ruby-processing:-
# This example shows how to create a cluster of sine oscillators, change the
# frequency and detune them depending on the position of the mouse in the
# renderer window. The Y position determines the basic frequency of the
# oscillator and X the detuning of the oscillator. The basic frequency ranges
# between 150 and 1150 Hz
load_library :sound
include_package 'processing.sound'
# The number of oscillators
NUM_SINES = 5
# For calculating the amplitudes
attr_reader :sine_volume, :sine_waves

def setup
  size 500, 500
  background 255
  no_stroke
  create_oscillators
end

def create_oscillators
  # Create the oscillators and amplitudes
  @sine_waves = []
  @sine_volume = []
  NUM_SINES.times do |i|
    # The overall amplitude shouldn't exceed 1.0
    # The ascending waves will get lower in volume the higher the frequency
    sine_volume << (1.0 / NUM_SINES) / (i + 1)
    # Create the Sine Oscillators and start them
    wav = SinOsc.new(self)
    wav.play
    sine_waves << wav
  end
end

def draw
  fill mouse_x, mouse_y, 0, 100
  ellipse(mouse_x, mouse_y, 10, 10)
  # Use mouse_y to get values from 0.0 to 1.0
  yoffset = (height - mouse_y) / height.to_f
  # Set that value logarithmically to 150 - 1150 Hz
  frequency = 1000**yoffset + 150
  # Use mouse_x from -0.5 to 0.5 to get a multiplier for detuning the
  # oscillators
  detune = mouse_x.to_f / width - 0.5
  # Set the frequencies, detuning and volume
  sine_waves.each_with_index do |wav, i|
    wav.freq(frequency * (i + 1 + i * detune))
    wav.amp(sine_volume[i])
  end
end
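As a quick sanity check on the frequency mapping used in draw, the endpoints of yoffset work out as follows (plain ruby, runnable on its own):

# frequency = 1000**yoffset + 150, as used in draw above
[0.0, 0.5, 1.0].each do |yoffset|
  puts format('yoffset %.1f -> %.1f Hz', yoffset, 1000**yoffset + 150)
end
# yoffset 0.0 -> 151.0 Hz
# yoffset 0.5 -> 181.6 Hz
# yoffset 1.0 -> 1150.0 Hz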
Labels:
library,
processing.org,
ruby-processing,
sound
Thursday, 22 January 2015
Mirror, a Dan Shiffman video capture sketch
#
# Mirror
# by Daniel Shiffman.
#
# Each pixel from the video source is drawn as a rectangle with rotation
# based on brightness.
load_library :video
include_package 'processing.video'
# Size of each cell in the grid
CELL_SIZE = 20
# Variable for capture device
attr_reader :cols, :rows, :video

def setup
  size(640, 480)
  frameRate(30)
  @cols = width / CELL_SIZE
  @rows = height / CELL_SIZE
  color_mode(RGB, 255, 255, 255, 100)
  @video = Capture.new(self, width, height)
  # Start capturing the images from the camera
  video.start
  background(0)
end

def draw
  return unless video.available
  video.read
  video.load_pixels
  # Begin loop for columns
  cols.times do |i|
    # Begin loop for rows
    rows.times do |j|
      # Where are we, pixel-wise?
      x = i * CELL_SIZE
      y = j * CELL_SIZE
      # Reversing x to mirror the image
      loc = (video.width - x - 1) + y * video.width
      r = red(video.pixels[loc])
      g = green(video.pixels[loc])
      b = blue(video.pixels[loc])
      # Make a new color with an alpha component
      c = color(r, g, b, 75)
      # Code for drawing a single rect
      # Using translate in order for rotation to work properly
      push_matrix
      translate(x + CELL_SIZE / 2, y + CELL_SIZE / 2)
      # Rotation formula based on brightness
      rotate((2 * PI * brightness(c) / 255.0))
      rect_mode(CENTER)
      fill(c)
      no_stroke
      # Rects are larger than the cell for some overlap
      rect(0, 0, CELL_SIZE + 6, CELL_SIZE + 6)
      pop_matrix
    end
  end
end
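The mirror effect comes entirely from the loc calculation in draw; with a toy frame four pixels wide the indices reverse within each row (plain ruby, independent of the sketch):

# loc = (video_width - x - 1) + y * video_width flips each row left to right
video_width = 4
2.times do |y|
  row = (0...video_width).map { |x| (video_width - x - 1) + y * video_width }
  p row
end
# => [3, 2, 1, 0]
# => [7, 6, 5, 4]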
Wednesday, 21 January 2015
Experimenting with refinements
Now unfortunately refinements are not yet supported with jruby-9.0.0.0, but I'm going to report an experiment here (with mri ruby 2.2.0) to replace the ruby-processing monkey-patched String with a refinement. Thought to self: I could probably use forwardable to tidy things up a bit.
# test refinement to replace monkey patching
module StringUtil
  refine String do
    def titleize
      underscore.humanize.gsub(/\b([a-z])/) { $1.capitalize }
    end

    def humanize
      gsub(/_id$/, '').gsub(/_/, ' ').capitalize
    end

    def camelize(first_letter_in_uppercase = true)
      if first_letter_in_uppercase
        gsub(/\/(.?)/) { '::' + $1.upcase }.gsub(/(^|_)(.)/) { $2.upcase }
      else
        self[0] + camelize[1..-1]
      end
    end

    def underscore
      gsub(/::/, '/')
        .gsub(/([A-Z]+)([A-Z][a-z])/, '\1_\2')
        .gsub(/([a-z\d])([A-Z])/, '\1_\2')
        .tr('-', '_')
        .downcase
    end
  end
end

# Using StringUtil
class Test
  using StringUtil

  def initialize(title)
    @title = title
  end

  def titleize
    @title.titleize
  end

  def humanize
    @title.humanize
  end
end

fred = Test.new('bloody_toad')
puts fred.humanize
puts fred.titleize
Output:-
Bloody toad
Bloody Toad
But perhaps we don't need refinements with use of 'forwardable'
require 'forwardable'

# Avoid the monkey patching of String for camelize
class CamelString
  extend Forwardable
  def_delegators(:@string, *String.public_instance_methods(false))

  def initialize(str)
    @string = str
  end

  def camelize(first_letter_in_uppercase = true)
    if first_letter_in_uppercase
      @string.gsub(/\/(.?)/) { '::' + $1.upcase }.gsub(/(^|_)(.)/) { $2.upcase }
    else
      @string[0] + camelize[1..-1]
    end
  end
end

test = 'test_case'
puts CamelString.new(test).camelize
puts CamelString.new(test).camelize false
Output:-
TestCase
testCase
require 'forwardable'

# Avoid the monkey patching of String for underscore/titleize/humanize
class StringExtra
  extend Forwardable
  def_delegators(:@string, *String.public_instance_methods(false))

  def initialize(str)
    @string = str
  end

  def titleize
    gsub(/::/, '/')
      .gsub(/([A-Z]+)([A-Z][a-z])/, '\1_\2')
      .gsub(/([a-z\d])([A-Z])/, '\1_\2')
      .tr('-', '_')
      .downcase
      .gsub(/_id$/, '')
      .gsub(/_/, ' ').capitalize
      .gsub(/\b([a-z])/) { $1.capitalize }
  end

  def humanize
    gsub(/_id$/, '').gsub(/_/, ' ').capitalize
  end

  def underscore
    gsub(/::/, '/')
      .gsub(/([A-Z]+)([A-Z][a-z])/, '\1_\2')
      .gsub(/([a-z\d])([A-Z])/, '\1_\2')
      .tr('-', '_')
      .downcase
  end
end

test = 'TestCase'
puts StringExtra.new(test).underscore
puts StringExtra.new(test).titleize
puts StringExtra.new(test).humanize
test_case
Test Case
Testcase
What is more, using these two classes I can completely replace the monkey-patching of String in JRubyArt, and hence probably ruby-processing...
Sunday, 18 January 2015
Configuring ruby processing
There is this gui for configuring ruby-processing; it is a regular processing sketch you can run in the processing ide.
You should click the noruby button if you have not installed jruby (and do not intend to install jruby) on your system (but then you may find you can't run sketches using other gems).
You could use processing-3.0a5 instead of processing-2.2.1 (but then use the processing ide to install the sound/minim and video libraries, since these are no longer included in processing-3.0).
The suggested location of processing root will adapt for your OS (a typical linux location is shown, Archlinux uses /usr/share/processing, others may choose /opt/processing).
Vanilla processing sketch configRP5.pde
This is an example config file:-
{
  "JRUBY": "false",
  "PROCESSING_ROOT": "/home/tux/processing-3.0a5",
  "X_OFF": 192,
  "Y_OFF": 108
}
Or create your own .rp5rc 'yaml' file in $HOME with an editor such as vim (configRP5 saves as 'json', but psych can use either form). X_OFF and Y_OFF can be used to control the position of the sketch on screen:-
---
PROCESSING_ROOT: /home/tux/processing-2.2.1
JRUBY: "false"
X_OFF: 50
Y_OFF: 50
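Since psych reads either form, one quick (hypothetical) way to check what ruby-processing will see is to load the file yourself:

# read ~/.rp5rc and print the settings; psych parses the yaml form and the json form
require 'yaml'

config = YAML.load_file(File.join(Dir.home, '.rp5rc'))
puts config['PROCESSING_ROOT']
puts config['JRUBY']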
Windows users might benefit from setting the 'RP5_ROOT' environment variable here (point it to where the gem gets installed). You can also enter global 'java_args' here if you wish, however using java_args.txt in the sketch's data folder is more flexible. Though apparently a .jrubyrc in a local folder takes precedence, which may be very useful to run say shader sketches (a possible alternative to my custom Rakefile to configure autorun sketches?). Anyone using processing-3.0+ should be aware that the sketchbook path setting might be changed for processing-3.0+; use JRubyArt instead of ruby-processing.
Labels:
configuration,
installation,
ruby-processing
Friday, 16 January 2015
A Golan Levin video capture sketch translated to ruby-processing
#
# Background Subtraction
# by Golan Levin.
# translated to ruby-processing by Martin Prout
# Detect the presence of people and objects in the frame using a simple
# background-subtraction technique. To initialize the background, press a key.
#
load_library :video
include_package 'processing.video'
attr_reader :background_pixels, :number_of_pixels, :video

def setup
  size(640, 480)
  init_video
end

def init_video
  # This is the default video input, see the test_capture
  # example if it creates an error
  @video = Capture.new(self, width, height)
  # Start capturing the images from the camera
  video.start
  # Create array to store the background image
  @number_of_pixels = video.width * video.height
  @background_pixels = Array.new(number_of_pixels, 0)
  # Make the pixels[] array available for direct manipulation
  load_pixels
end

def capture
  # captureEvent does not work like vanilla processing
  @video.read
  background 0
end

def draw
  return unless (video.available == true)
  capture
  video.load_pixels # Make the pixels of video available
  # Difference between the current frame and the stored background
  # current_sum = 0
  number_of_pixels.times do |i| # For each pixel in the video frame...
    # Fetch the current color in that location, and also the color
    # of the background in that spot
    curr_color = video.pixels[i]
    bkgd_color = background_pixels[i]
    # Extract the red, green, and blue components of the current pixel's color
    curr_r = curr_color >> 16 & 0xff
    curr_g = curr_color >> 8 & 0xff
    curr_b = curr_color & 0xff
    # Extract the red, green, and blue of the background pixel's color
    bkgd_r = bkgd_color >> 16 & 0xff
    bkgd_g = bkgd_color >> 8 & 0xff
    bkgd_b = bkgd_color & 0xff
    # Compute the difference of the red, green, and blue values
    diff_r = (curr_r - bkgd_r).abs
    diff_g = (curr_g - bkgd_g).abs
    diff_b = (curr_b - bkgd_b).abs
    # Add these differences to the running tally
    # current_sum += diff_r + diff_g + diff_b
    # Render the difference image to the screen
    pixels[i] = (diff_r << 16) | (diff_g << 8) | diff_b
  end
  update_pixels # Notify that the pixels[] array has changed
  # p current_sum # Print out the total amount of movement
end

def key_pressed
  video.load_pixels
  @background_pixels = video.pixels.clone
end
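The channel extraction in draw is just bit-shifting and masking on the packed colour int; a standalone example with an arbitrary colour:

# unpack a 0xAARRGGBB colour the same way draw does above
colour = 0xFF336699
r = colour >> 16 & 0xff # => 0x33 (51)
g = colour >> 8 & 0xff  # => 0x66 (102)
b = colour & 0xff       # => 0x99 (153)
p [r, g, b] # => [51, 102, 153]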
Labels:
Golan Levin,
ruby-processing,
video capture
Sunday, 11 January 2015
A simplified ASCII video capture sketch in ruby processing
In ruby it is often easier to see the wood from the trees, and the code is somewhat self-documenting:-
Character placement might be somewhat more regimented in this version, but it saves a heap of push/pop matrix calls.
#
# Simplified ASCII Video capture sketch in ruby-processing
# by Martin Prout after a Ben Fry original
#
# Text chars have been used to represent images since the earliest computers.
# This sketch is a simple homage that re-interprets live video as ASCII text.
# See the key_pressed function for more options, like changing the font size.
#
load_library :video
include_package 'processing.video'
attr_reader :bright, :char, :cheat_screen, :font, :font_size, :letters, :video
# All ASCII characters, sorted according to their visual density
LETTER_STRING = %q{ .`-_':,;^=+/\"|)\\<>)iv%xclrs{*}I?!][1taeo7zjLunT#JCwfy325Fp6mqSghVd4EgXPGZbYkOA&8U$@KHDBWNMR0Q}
LETTER_ORDER = LETTER_STRING.scan(/./)

def setup
  size(640, 480)
  init_video
  @font_size = 1.5
  @font = load_font(data_path('UniversLTStd-Light-48.vlw'))
  # for the 256 levels of brightness, distribute the letters across
  # an array of 256 elements to use for the lookup
  @letters = (0...256).map do |i|
    LETTER_ORDER[map1d(i, (0...256), (0...LETTER_ORDER.length))]
  end
  # current brightness for each point
  @bright = Array.new(video.width * video.height, 128)
end

def init_video
  # This is the default video input, see the test_capture
  # example if it creates an error
  @video = Capture.new(self, 160, 120)
  # Start capturing the images from the camera
  video.start
  @cheat_screen = false
end

def capture
  # captureEvent does not work like vanilla processing
  @video.read
  background 0
end

def draw
  return unless (video.available == true)
  capture
  hgap = width / video.width
  vgap = height / video.height
  scale([hgap, vgap].max * font_size)
  text_font(font, font_size)
  index = 0
  video.load_pixels
  (0...video.height).each do |y| # Move down for next line
    (0...video.width).each do |x|
      pixel_color = video.pixels[index]
      # Faster method of calculating r, g, b than red(), green(), blue()
      r = pixel_color >> 16 & 0xff
      g = pixel_color >> 8 & 0xff
      b = pixel_color & 0xff
      # Calculate brightness as luminance:
      # luminance = 0.3*red + 0.59*green + 0.11*blue
      # Or you could instead use red + green + blue, and make the values[] array
      # 256*3 elements long instead of just 256.
      pixel_bright = [0.3 * r, 0.59 * g, 0.11 * b].max
      # The 0.1 value is used to damp the changes so that letters flicker less
      diff = pixel_bright - bright[index]
      bright[index] += diff * 0.1
      fill(pixel_color)
      text(letters[bright[index]], x / font_size, y / font_size)
      # Move to the next pixel
      index += 1
    end
  end
  set(0, height - video.height, video) if cheat_screen
end

MESSAGE = <<-EOS
Controls are:
g to save_frame, f & F to set font size
c to toggle cheat screen display
EOS

def key_pressed
  case key
  when 'g' then save_frame
  when 'c' then @cheat_screen = !cheat_screen
  when 'f' then @font_size *= 1.1
  when 'F' then @font_size *= 0.9
  else
    warn MESSAGE
  end
end
Labels:
ascii,
Ben Fry,
jruby,
ruby-processing,
video capture
Saturday, 10 January 2015
A Ben Fry video capture sketch in ruby-processing
A more funky video capture example in ruby-processing; in my hands this sketch seems to require the --nojruby flag.
#
# ASCII Video
# by Ben Fry, translated to ruby-processing by Martin Prout.
#
# Text chars have been used to represent images since the earliest computers.
# This sketch is a simple homage that re-interprets live video as ASCII text.
# See the key_pressed function for more options, like changing the font size.
#
load_library :video
include_package 'processing.video'
attr_reader :bright, :char, :cheat_screen, :font, :font_size, :letters, :video
# All ASCII characters, sorted according to their visual density
LETTER_STRING = %q{ .`-_':,;^=+/\"|)\\<>)iv%xclrs{*}I?!][1taeo7zjLunT#JCwfy325Fp6mqSghVd4EgXPGZbYkOA&8U$@KHDBWNMR0Q}
LETTER_ORDER = LETTER_STRING.scan(/./)

def setup
  size(640, 480)
  init_video
  @font_size = 1.5
  @font = load_font(data_path('UniversLTStd-Light-48.vlw'))
  # for the 256 levels of brightness, distribute the letters across
  # an array of 256 elements to use for the lookup
  @letters = (0...256).map do |i|
    LETTER_ORDER[map1d(i, (0...256), (0...LETTER_ORDER.length))]
  end
  # current brightness for each point
  @bright = Array.new(video.width * video.height, 128)
end

def init_video
  # This is the default video input, see the test_capture
  # example if it creates an error
  @video = Capture.new(self, 160, 120)
  # Start capturing the images from the camera
  video.start
  @cheat_screen = false
end

def capture_event(c)
  c.read
  background 0
end

def draw
  return unless (video.available == true)
  capture_event(video)
  push_matrix
  hgap = width / video.width
  vgap = height / video.height
  scale([hgap, vgap].max * font_size)
  text_font(font, font_size)
  index = 0
  video.load_pixels
  (0...video.height).each do
    # Move down for next line
    translate(0, 1.0 / font_size)
    push_matrix
    (0...video.width).each do
      pixel_color = video.pixels[index]
      # Faster method of calculating r, g, b than red(), green(), blue()
      r = pixel_color >> 16 & 0xff
      g = pixel_color >> 8 & 0xff
      b = pixel_color & 0xff
      # Another option would be to properly calculate brightness as luminance:
      # luminance = 0.3*red + 0.59*green + 0.11*blue
      # Or you could instead use red + green + blue, and make the values[] array
      # 256*3 elements long instead of just 256.
      pixel_bright = [r, g, b].max
      # The 0.1 value is used to damp the changes so that letters flicker less
      diff = pixel_bright - bright[index]
      bright[index] += diff * 0.1
      fill(pixel_color)
      text(letters[bright[index]], 0, 0)
      # Move to the next pixel
      index += 1
      # Move over for next character
      translate(1.0 / font_size, 0)
    end
    pop_matrix
  end
  pop_matrix
  # image(video, 0, height - video.height)
  # set() is faster than image() when drawing untransformed images
  set(0, height - video.height, video) if cheat_screen
end

MESSAGE = <<-EOS
Controls are:
g to save_frame, f & F to set font size
c to toggle cheat screen display
EOS

#
# Handle key presses:
# 'c' toggles the cheat screen that shows the original image in the corner
# 'g' grabs an image and saves the frame to a tiff image
# 'f' and 'F' increase and decrease the font size
#
def key_pressed
  case key
  when 'g' then save_frame
  when 'c' then @cheat_screen = !cheat_screen
  when 'f' then @font_size *= 1.1
  when 'F' then @font_size *= 0.9
  else
    warn MESSAGE
  end
end
Labels:
Ben Fry,
capture,
ruby-processing,
video
Video capture with ruby-processing
A simple sketch to get you started with video capture in ruby-processing. It makes sense to set the sketch to dimensions you think your camera can work with (use cameras instead of select for an unfiltered list, but it can be a very long list, as many low-resolution cameras get listed, each with lots of fps settings).
load_library :video
include_package 'processing.video'
attr_reader :cam

def setup
  size(960, 544)
  cameras = Capture.list
  fail 'There are no cameras available for capture.' if (cameras.length == 0)
  p 'Matching cameras available:'
  size_pattern = Regexp.new(format('%dx%d', width, height))
  select = cameras.grep size_pattern # filter available cameras
  select.map { |cam| p cam }
  fail 'There are no matching cameras.' if (select.length == 0)
  start_capture(select[0])
end

def start_capture(cam_string)
  # The camera can be initialized directly using an
  # element from the array returned by list:
  @cam = Capture.new(self, cam_string)
  p format('Using camera %s', cam_string)
  cam.start
end

def draw
  return unless (cam.available == true)
  cam.read
  image(cam, 0, 0)
  # The following does the same, and is faster when just drawing the image
  # without any additional resizing, transformations, or tint.
  # set(0, 0, cam)
end
Labels:
capture,
ruby-processing,
video