119 posts in 'All Categories'

  1. 2008.11.23 Brian Crain - Butterfly Waltz
  2. 2008.10.30 Programming Is Imagination
  3. 2008.10.29 lloopp
  4. 2008.10.27 2008 Suncheon Bay Reed Festival
  5. 2008.10.23 A Few Principles of Video Tracking
  6. 2008.10.23 Processing Tutorials
  7. 2008.10.23 Seoul International Computer Music Festival 2009 Call for Works
  8. 2008.10.17 WiiPaint Fall 2007
  9. 2008.10.15 Grand Mint Festival 2008
  10. 2008.10.07 Messing with P5Sunflow

Brian Crain - Butterfly Waltz

This is the piece I've been practicing lately. It isn't a difficult piece, but it's still difficult for me.
Partway through, my fingers keep getting tangled. ㅠㅠ

Programming Is Imagination

Book details: Programming Is Imagination, written by Baekjun Lim, published by Hanbit Media.

lloopp


lloopp is an open-source program based on Max/MSP and Jitter, developed by Noid and Klaus Filip. It is an interactive performance program released as freeware, so anyone can download and use it. Many well-known artists in the English-speaking world already use it in their own work, so its usefulness and versatility are well established, and it is still being updated today.

http://lloopp.klingt.org/plone/lloopp/


Klaus Filip

An electronic musician and software developer from Austria.

Klaus Filip has spent many years working to expand the possibilities of sound and technology. He has composed electronic music with numerous musicians, and has also made sound installations and short experimental films. He is the developer of lloopp, the Max/MSP-based open-source program, and he continues to develop it while collaborating with musicians such as Christian Fennesz, Boris Hauf, and Radu Malfatti.


2008 Suncheon Bay Reed Festival


A Few Principles of Video Tracking

The idea of tracking motion on a computer using a video camera has been around for a couple of decades, and it still isn't perfect, because the construction of vision is a complex subject. We don't just "see"; we construct colors, edges, objects, depth, and other aspects of vision from the light that reaches our retinas. If you want to program a computer to see in the same way, it has to have subroutines that define the characteristics of vision and allow it to distinguish those characteristics in the array of pixels that comes from a camera. For more on that, see Visual Intelligence: How We Create What We See by Donald Hoffman. There are many other texts on the subject, but his is a nice popular introduction. What follows is a very brief introduction to some of the basic concepts behind computer vision and video manipulation.

There are a number of toolkits available for getting data from a camera and manipulating it. They vary from very high-level, simple graphical tools to low-level tools that let you manipulate the pixels directly. Which one you need depends on what you want to do. Regardless of your application, the first step is always the same: you get the pixels from the camera as an array of numbers, one frame at a time, and do things with the array. Typically, each entry in the array gives you a location and the relative levels of red, green, and blue light at that location.
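
As a minimal sketch of that first step, here is what it looks like in Processing (one of the toolkits mentioned below), using its standard video library; the frame size here is an arbitrary choice, and some Processing versions also need an explicit start() call on the camera:

  import processing.video.*;

  Capture cam;

  void setup() {
    size(320, 240);
    // open the default camera at the sketch's size
    cam = new Capture(this, width, height);
  }

  void draw() {
    if (cam.available()) {
      cam.read();        // pull the newest frame from the camera
      cam.loadPixels();  // expose the frame as a flat array of pixels
      // the array is row-major: index = y * width + x
      color c = cam.pixels[(height / 2) * cam.width + width / 2];
      // relative levels of red, green, and blue at the center pixel
      println(red(c) + ", " + green(c) + ", " + blue(c));
      image(cam, 0, 0);  // draw the frame to the window
    }
  }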

There are a few popular applications that people tend to develop when they attach a camera to a computer:

Video manipulation takes the image from the camera, changes it somehow, and re-presents it to the viewer in changed form. In this case, the computer doesn't need to be able to interpret objects in the image, because you're basically just applying filters, not unlike Photoshop filters.
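
For example, a draw() loop along these lines (a sketch reusing the camera setup above) applies a simple invert filter to each frame and re-presents it, with no interpretation of objects at all:

  // Reuses the camera setup from the sketch above; only draw() changes.
  void draw() {
    if (cam.available()) {
      cam.read();
      cam.loadPixels();
      loadPixels();  // the window's own pixel array
      for (int i = 0; i < cam.pixels.length; i++) {
        color c = cam.pixels[i];
        // re-present each pixel with its color channels inverted
        pixels[i] = color(255 - red(c), 255 - green(c), 255 - blue(c));
      }
      updatePixels();
    }
  }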

Tracking looks for a blob of pixels that's unique, perhaps the brightest blob, or the reddest blob, and tracks its location over a series of frames. Tracking can be complicated, because the brightest blob from one frame to another might not be produced by the same object.
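
A minimal brightest-pixel tracker in Processing might look like the sketch below (same assumed camera setup as above); note that nothing guarantees the winning pixel belongs to the same object from one frame to the next:

  // Again reusing the camera setup shown earlier.
  void draw() {
    if (cam.available()) {
      cam.read();
      cam.loadPixels();
      image(cam, 0, 0);
      float maxB = -1;
      int maxX = 0, maxY = 0;
      // scan every pixel, remembering the brightest one seen so far
      for (int y = 0; y < cam.height; y++) {
        for (int x = 0; x < cam.width; x++) {
          float b = brightness(cam.pixels[y * cam.width + x]);
          if (b > maxB) {
            maxB = b;
            maxX = x;
            maxY = y;
          }
        }
      }
      noFill();
      stroke(255, 0, 0);
      ellipse(maxX, maxY, 16, 16);  // circle the brightest spot
    }
  }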

Object recognition looks for a blob that matches a particular pattern, like a face, identifies that blob as an object, and keeps track of its location over time. Object recognition is the hardest of all three applications, because it involves both tracking and pattern recognition. If the object rotates, or if its colors shift because of a lighting change, or it gets smaller as it moves away from the camera, the computer has to be programmed to compensate. If it's not, it may fail to "see" the object, even though it's still there.

There are a number of programs available for video manipulation. Jitter, a plugin for Max/MSP, is a popular one. David Rokeby's softVNS is another plugin for Max. Mark Coniglio's Isadora is a visual programming environment like Max/MSP that's dedicated to video control, optimized for live events like dance and theatre. Image/ine is similar to Isadora, though aging, as it hasn't been updated in a couple of years. There are also countless VJ packages that will let you manipulate live video. In addition, most text-based programming languages have toolkits too. Danny Rozin's TrackThemColors Pro does the job for Macromedia Director MX, as does Josh Nimoy's Myron. Myron also works for Processing. Dan O'Sullivan's vbp does the job for Java. Dan has an excellent site on the subject as well, with many more links. He's also got a simple example for Processing on his site. Almost all of these toolkits can handle video tracking as well.

There are two methods you'll commonly find in video tracking software: the zone approach and the blob approach. Software such as softVNS, Eric Singer's Cyclops, or cv.jit (a plugin for Jitter that affords video tracking) takes the zone approach. They map the video image into zones, and give you information about the amount of change in each zone from frame to frame. This is useful if your camera is in a fixed location and you want fixed zones that trigger activity. Eric has a good example on his site in which he uses Cyclops to play virtual drums. The zone approach makes it difficult to track objects across an image, however. TrackThemColors and Myron are examples of the blob approach, in that they return information about unique blobs within the image, making it easier to track an object moving across an image.
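
Here is a rough illustration of the zone approach in Processing (not how softVNS or Cyclops actually implement it): the frame is divided into a grid, and each zone reports how much it changed since the previous frame. The grid size and trigger threshold are arbitrary choices:

  int cols = 8, rows = 6;   // arbitrary grid size
  color[] prev;             // the previous frame's pixels

  void draw() {
    if (cam.available()) {
      cam.read();
      cam.loadPixels();
      image(cam, 0, 0);
      if (prev == null) {
        prev = new color[cam.pixels.length];  // first frame: nothing to compare yet
      } else {
        int zw = cam.width / cols, zh = cam.height / rows;
        for (int zy = 0; zy < rows; zy++) {
          for (int zx = 0; zx < cols; zx++) {
            float change = 0;
            // total brightness difference across this zone
            for (int y = zy * zh; y < (zy + 1) * zh; y++) {
              for (int x = zx * zw; x < (zx + 1) * zw; x++) {
                int i = y * cam.width + x;
                change += abs(brightness(cam.pixels[i]) - brightness(prev[i]));
              }
            }
            if (change / (zw * zh) > 10) {  // arbitrary trigger threshold
              noFill();
              stroke(0, 255, 0);
              rect(zx * zw, zy * zh, zw, zh);  // mark the zone as active
            }
          }
        }
      }
      arrayCopy(cam.pixels, prev);  // remember this frame for next time
    }
  }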

At the most basic level, a computer can tell you a pixel's position, and its color (if you are using a color camera). From those facts, other information can be determined:

  • The brightest pixel can be determined by seeing which pixel has the highest color values.
  • A "blob" of color can be determined by choosing a starting color, setting a range of variation, and checking the neighboring pixels of a selected pixel to see if they are in the range of variation (a sketch of this follows the list).
  • Areas of change can be determined by comparing one frame of video with a previous frame, and seeing which pixels have the most significantly different color values.
  • Areas of pattern can be followed by selecting an area to track, and continuing to search for areas that match the pattern of pixels selected. Again, a range of variation can be set to allow for "fuzziness".
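
As a sketch of the second item, here is a simple color-range test in Processing (same assumed camera setup as the earlier sketches): pick a target color and a tolerance, and mark every pixel that falls within range. A real blob tracker would also group neighboring matches into connected blobs; the target color and tolerance below are placeholders to tune:

  color target = color(255, 0, 0);  // placeholder color to look for
  float tolerance = 60;             // placeholder range of variation

  void draw() {
    if (cam.available()) {
      cam.read();
      cam.loadPixels();
      image(cam, 0, 0);
      loadPixels();
      for (int i = 0; i < cam.pixels.length; i++) {
        color c = cam.pixels[i];
        // distance between this pixel and the target, treating RGB as 3D space
        float d = dist(red(c), green(c), blue(c),
                       red(target), green(target), blue(target));
        if (d < tolerance) {
          pixels[i] = color(255, 255, 0);  // mark in-range pixels
        }
      }
      updatePixels();
    }
  }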

A few practical principles follow from this:

Colors to be tracked need consistent lighting. The computer can't tell that my shirt is red, for example; it can only tell that one pixel or a range of pixels contains a color value of, say, [255, 0, 0]. If the lighting changes and my shirt appears gray because there is no red light for it to reflect, the computer will no longer "see" it as red.

Shapes to be tracked need to stay somewhat consistent in shape. The computer doesn't have stereoscopic vision (two eyes that allow us to determine depth by comparing the difference in image that our two eyes receive), so it sees everything as flat. If your hand turns sideways with respect to the camera, the pattern changes because your hand appears thinner. So the computer may no longer recognize your hand as your hand.

One simple way of getting consistent tracking is to reduce the amount of information the computer has to track. For example, if the camera is equipped with an infrared filter, it will see only infrared light. This is very useful, since incandescent sources (lightbulbs with filaments) give off infrared, whereas fluorescent sources don't. Furthermore, the human body gives off very little light in the near-infrared range that cameras can see. This is also useful for tracking in front of a projection, since the image from most LCD projectors contains no infrared light.

When considering where to position the camera, consider what information you want to track. For example, if you want to track a viewer's motion in two dimensions across a floor, then positioning a camera in front of the viewer may not be the best choice. Consider ways of positioning the camera overhead, or underneath the viewer.

Often it is useful to put the tracking camera behind the projection surface, and use a translucent screen, and track what changes on the surface of the screen. This way, the viewer can "draw" with light or darkness on the screen.


Source: http://www.tigoe.net/pcomp/videoTrack.shtml


Processing Tutorials

There isn't yet a really well-organized Processing tutorial, in Korea or abroad, but I've gathered the ones that are fairly good and easy to approach. I think they will help students and interaction designers who lack a technical background. They're in English, but at a level that's easy enough to understand, so there's no need to worry. I hope they're useful.

http://itp.nyu.edu/~sve204/icm_fall06/
http://itp.nyu.edu/ICM/james/
http://itp.nyu.edu/icm/shiffman/
http://www.shiffman.net/teaching/workshop/

http://www.thesystemis.com/makingThingsMove/index.html
http://thesystemis.com/eatingVideo/
http://itp.nyu.edu/~dbo3/cgi-bin/ClassWiki.cgi?ICMVideo

Thanks to: Chris O'Shea + Processing discourse

Source: http://www.digitypo.com/blog/entry/Processing-Tutorials



Seoul International Computer Music Festival 2009 Call for Works

Seoul International Computer Music Festival 2009 Call for Works
The Korean Electro-Acoustic Music Society invites submissions of works to be performed at the Seoul International Computer Music Festival 2009.

Categories
1. Tape music
2. Instrument(s) (up to 8 performers) with electronics (tape or live)
3. Live electronic music
4. Audio-visual media works

Rules
1. Works must have been composed in 2006 or later.
2. Works must be no longer than 15 minutes.
3. Electronic music with instruments may use no more than 8 performers.
4. For works involving special instruments, the composer is responsible for bringing the instruments and performers.
5. All works are limited to a maximum of 8 audio channels.
6. Two or more works may be submitted.

Submission Deadline (online)
- Monday, December 1, 2008, 6 p.m. (Seoul time, UTC+9)

How to Submit
1. Submissions are accepted online only.
- Go to Webhard ( http://www.webhard.co.kr/ )
- ID: computermusic / Password: guest
- In the 'upload only' (올리기 전용) folder, create a folder under your own name and upload the files below.
2. Upload the work files.
- Audio files must be uploaded as stereo mp3.
- For live electronic music: upload a recorded audio file (mp3, if available) and related files (patches, documentation, programs, etc.).
- For electronic music with instruments, the score (PDF) must be uploaded.
- For audio-visual works: upload the video file in a format such as mpeg, mov, or avi, keeping the total size under 200MB.
3. Upload a document file (format: TEXT, RTF, PDF, DOC, or HWP) containing the following information:
- Name
- Gender
- Nationality
- Phone (mobile)
- Email
- Website (if any)
- Title of the work
- Duration of the work
- Category
- Instruments (if any)
- Number of audio output channels
- Program notes
- Profile
- Any special requirements for performance (if any)
4. Other notes
- Uploaded files cannot be downloaded by anyone else, so don't worry.
- Uploaded files cannot be modified or deleted. If you need to re-upload a file, please upload it again under a different name.
- Uploaded files will be deleted from Webhard within a few days after submissions close.
- If online submission is impossible, please inquire by email.

Support Policy
1. The society will cover the costs of performing selected works (performers' fees, instrument rental, etc.).
2. If a selected composer living abroad visits Korea, accommodation will be provided for the duration of the festival.
3. If, for special reasons, a composer brings a performer, the performer's accommodation will also be covered.
* This policy may change depending on the society's circumstances.

Inquiries and Further Information
master@keams.org
http://www.computermusic.or.kr

President: Youngmi Lim (임영미)

Related URL: www.computermusic.or.kr/

WiiPaint Fall 2007


WiiPaint
Fall 2007
Quartz Composer+VDMX

During an internship with Vidvox, I began working with Apple's Quartz Composer image software. Vidvox's VDMX VJ software can accept Quartz Composer patches and also interfaces with a Wii remote. My drawing patch gets the x and y position from the Wiimote; the A and B buttons control drawing and clearing. The amount of time you stay in one place changes the amount of bleed, so it becomes like a spraypaint effect.
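
The patch itself is Quartz Composer, so there's no code to quote, but the dwell-time idea is easy to sketch. Here is a hypothetical Processing analogue, with the mouse standing in for the Wiimote: hold the button to draw, and the longer the cursor lingers in one spot, the wider the paint bleeds:

  float dwell = 0;            // how long the cursor has stayed in one spot
  int lastX = -1, lastY = -1;

  void setup() {
    size(640, 480);
    background(0);
  }

  void draw() {
    if (mousePressed) {
      // dwell builds while the position barely changes, resets on movement
      if (dist(mouseX, mouseY, lastX, lastY) < 3) {
        dwell += 1;
      } else {
        dwell = 0;
      }
      noStroke();
      fill(255, 30);  // low opacity, so paint accumulates over time
      ellipse(mouseX, mouseY, 10 + dwell, 10 + dwell);  // bleed grows with dwell
    }
    if (keyPressed && key == 'c') {
      background(0);  // clear the canvas, like the patch's clear button
    }
    lastX = mouseX;
    lastY = mouseY;
  }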

The Wiimote is a very interesting piece of technology because it allows for gestural control, which is lacking in many digital art works. I have set this up in a few live situations and let crowd members interact with it freely, and they seem to have a lot of fun playing around with it.

I made a short sample video of the patch within VDMX:



Wii Paint test from blair neal on Vimeo.


Grand Mint Festival 2008

Grand Mint Festival 2008

I'll be VJing for Deb (뎁) at this year's Grand Mint Festival..
Whew.. it's my first big show, so I'm nervous, worried I'll make mistakes,
and I also regret not having prepared harder...

So embarrassing....


Messing with P5Sunflow


Cube Explosion

Ray tracing is a CG rendering technique used for feature-film and broadcast-quality CG. Basically, it bounces millions of virtual photons around the scene to simulate how objects reflect light and cast shadows on each other. This produces super-realistic images, at the cost of being very computationally expensive.

P5Sunflow, created by Mark Chadwick, is a Processing version of Sunflow, an open-source Java ray tracer.

P5Sunflow produces images with creamy shadows and a solid, sculptural feel that are quite different from anything you can achieve with most real-time 3D engines. Unfortunately, rendering times are very slow. The videos below are overnight renders. I'd be interested to find out if there is some kind of 'fake' ray tracing that produces similar results more quickly.
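
For reference, using P5Sunflow mostly amounts to selecting it as the sketch's renderer. A minimal sketch might look like the following; the renderer class string ("hipstersinc.P5Sunflow") is an assumption to verify against the install instructions linked below:

  void setup() {
    // pass P5Sunflow as the renderer so drawing calls get ray traced
    // (the class path string is an assumption; check the library docs)
    size(400, 400, "hipstersinc.P5Sunflow");
  }

  void draw() {
    background(0);
    lights();
    translate(width / 2, height / 2);
    rotateY(0.8);
    box(120);   // a single cube, ray traced with soft shadows
    noLoop();   // render one frame only; ray tracing is slow
  }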

Click through to see the HD and downloadable QuickTime versions. These work well looped in QT.


Cube Wall from felixturner on Vimeo.


Sunflow Phase Towers from felixturner on Vimeo.

You can download the Processing sketch for the cube wall animation here. To use it you need to install the P5Sunflow library as described here. To run P5Sunflow you need the version of Processing that comes without Java, since the library requires Java 1.5 and Processing ships with Java 1.4.
