Monday, December 12, 2011

12/12 - Capstone Exploration Reflection

At Capstones, I interviewed Katt McConnell regarding her experience.

1. What's this week been like?

1a. Lots of Skyrim. In order to have discs, I had to submit my masters in November, so I've been mostly done since then. I've been working on my website.

2. If there's anything you could change, what would it be?

2a. I would've liked to add violin to one of the tracks.

3. How close is the final outcome to your goal?

3a. The album evolved from 12 tracks to 13 when one of the songs ran very long and I decided to split it. Also, I planned to have a "Making Of" video, but did not have enough time to make it.

4. If you had one more month, what would you do? (When I asked this, I added "besides the Making Of video.")


4a. I'm done with it. This was a project where there was always a little more I'd like to do, but I had to draw a line in the sand and say this is it.

5. What advice do you have for those working on capstones?


5a. Add as much padding to your schedule as you can, because shit happens. Editor's note: As part of the conversation, she mentioned that early in the year she became very ill, to the point that she had to re-record all of her vocal tracks in October.




In regards to the class itself, here are my responses to your questions:


1. Address the efficacy of the approach we took in this class.


1a. I think having the class meet for half the semester ended up being less beneficial. I think, assuming scheduling would have permitted, it could have worked to have faculty come for guest lectures leading up to midterm and have past students come as we started working on capstones. 

2. Discuss whether or not the breakdown of the project development was advantageous.

2a. If by breakdown you mean having the deadlines for the capstone idea, mentor setup, etc., it was very advantageous for me. Having additional accountability is always useful for me.

3. Discuss the advantages and/or disadvantages of the student guest speakers for forming your own approach to capstone

3a. The flaw of the student speakers is that all (or many, if I'm not remembering correctly) of them worked on their capstones longer than many of us will have the opportunity to. However, having us visit Capstones was a useful counterpoint, as many of those students only worked for the semester their project was due, and (for some more than others) it showed.

4. Were there any assignments that you thought were particularly helpful? Please elaborate

4a. The peer group assignment ended up being the standout. Finding out I wasn't the only person doing a (relatively) crazy-scope, innovative-concept, programming-heavy capstone was both refreshing and reassuring. Also, we ended up meeting about every other week, so it, once again, gave me another person/group to be accountable to with regard to making progress.
 
5. Were there any assignments that you thought were particularly useless? Please elaborate.

5a. I may be a fringe case for this, but the Secondary Mentor assignment became useless to me after completing the Primary Mentor assignment. I understand and appreciate the reasoning behind it, but a combination of a dearth of truly applicable mentors as well as good responses from the potential primary mentors made it difficult and almost busy-work to find and send out requests to potential secondary mentors.

Tuesday, November 29, 2011

Progress Report - 11/29/11

1) What concrete progress have you made since you submitted your proposal and timeline?


1a) As mentioned in my last progress report, I've set up some branding elements (Twitter account, website) and set up version control. I've also refined my hand-tracking system to have smoother animation and a truly 'still' cursor when the hand is reasonably still. I'm currently in the process of implementing relational tracking for the hands. Immediately following that will be implementing tracking for both the active hand and the inactive one to allow for a smooth cursor handoff, for lack of a better term.


2) How many times have you been in contact with your mentor? List what you've discussed.


2a) I've met with my mentor twice briefly since our first formal meeting, where she approved my timeline and proposal. We discussed version-control solutions and setup, and she also helped me write the simple algorithm I'm using for animation smoothing. We plan on meeting Thursday, where I hope to discuss the relational tracking and build my immediate timeline.


3) How has your timeline changed thus far? Discuss what changed and why you changed it.


3a) The biggest change has been pushing back my goal lines for this semester. I had hoped to have a very basic prototype done, and I'm still not there yet. After the goals mentioned in the answer to question 1, the last step before I feel ready to start writing the actual application will be gesture tracking, but that's pending my discussion with Dr. Baker. We may decide that it's worth putting on the back burner between getting a rough interface written and the beginning of next semester, assuming that getting the interface written doesn't go quickly.


4) What do you still need to complete in preparation for 499? Identify key aspects that need to be taken care of before your capstone starts.


4a) To itemize and elaborate on 3a a bit more, here's what HAS to be done:

  • Hand tracking (both hands, with smoothing): tracking both hands is necessary both for the smooth cursor transition and for leaving me a hook to assist in gesture tracking.
  • Interface building: Right now, the 'application' is an empty canvas to move a cursor in. At the very least, I need to build a testing ground for different bounding areas and time-to-select so I can test those.
  • Gesture tracking: I'm envisioning this being complex, assuming there aren't extensive libraries already available for use, so this is pending discussion. When hand tracking is complete, I will have the skeleton on which to add the meat of gesture tracking, but having gestures will also require having an interface to gesture through, and writing my own gesture solution may require more calculations, which could negatively impact performance.
  • Coming up with a good solution to bring the Kinect with me: It's about as simple as it sounds. The Kinect is relatively fragile, and I need a good way to be able to bring it to campus with me. It may just be as simple as wrapping it in a towel or something and putting it in my bag, but I'll just need to find a way to prevent any pressure that may strain the motors.

5) If there have been significant changes to your project since your proposal, please discuss them.

5a) Nothing in particular. The biggest change is that I'm only planning on using one Kinect as opposed to the three I originally considered.

Sunday, October 30, 2011

Progress Report - 10/30/11

I just wanted to give an overview of everything I've done since my last post:


  • I set up a Twitter account
  • I made a web page that gives a small skeletal sample of the aesthetic I'm hoping to go for with the final program
  • I set up a Subversion-based version control solution

In bigger events, I've mostly finished a gut-and-rewrite of the SkeletalViewer into original (or at least mostly original) code. The best thing to come from that process is a better understanding of how that program goes about its tracking. With that knowledge, I've also rewritten my tracking code with the intent of allowing a higher-level program awareness of which hand is active, as well as smoothing the transition between active hands.

I've also implemented a rough attempt at an animation-smoothing solution for the cursor. I create arrays of X and Y values that hold the last 15 samples of each; then, instead of using the hand's current X/Y position, the cursor uses the average of the arrays. It has, at the very least, allowed a 'still' hand to appear more still. Unfortunately, my means of outputting the array values (stepping through the array in a loop and printing each value) wreaks havoc on the program's performance, so I'm unsure whether the runs of identical output values were due to an error in my code or to the video feeds not being read at the 30 FPS they're supposed to be.
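The smoothing scheme described above, a simple moving average over the last 15 samples, can be sketched like this. This is a hedged, language-neutral illustration in Python (the actual project code is C#); only the window size of 15 comes from the post, the rest of the names are mine:

```python
from collections import deque

class CursorSmoother:
    """Smooths cursor motion by averaging the last N hand positions."""

    def __init__(self, window=15):
        self.xs = deque(maxlen=window)  # last `window` X samples
        self.ys = deque(maxlen=window)  # last `window` Y samples

    def update(self, x, y):
        """Record the hand's current position; return the smoothed cursor position."""
        self.xs.append(x)
        self.ys.append(y)
        return (sum(self.xs) / len(self.xs), sum(self.ys) / len(self.ys))

# A perfectly still hand yields a perfectly still cursor:
smoother = CursorSmoother()
for _ in range(20):
    pos = smoother.update(100.0, 200.0)
print(pos)  # (100.0, 200.0)
```

The trade-off of a plain moving average is lag: a fast-moving hand's cursor trails behind by up to half the window, which is why the window size matters for responsiveness.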

Also, with C# being the first strictly typed (or even strict-ish) language I've written in, it was interesting to find out how static arrays are in C# versus PHP, JavaScript, or ActionScript.

Tuesday, October 11, 2011

Timeline Creation - 10/11/11

I began my process by sitting down with the skeleton of my project:

  • Kinect interaction through gesture and body tracking
  • Displayed in VR Theater and/or an array of computer monitors (for when the cave is used by another student)
  • Twitter/Facebook data (read-only)
I then came up with milestones to reach those features:
  • Rough (but navigable) mockup of the interface (no Twitter data, not necessarily tested/designed for Cave)
  • Functional Prototype (framework of aesthetic design implemented, some Twitter data, designed to be tested in the final arrangement)
  • Final skeleton (finalized aesthetics, Facebook and/or other social media sources, optimized for the final arrangement)
I thought about what would be involved in each milestone and came up with rough chunks of time by which it would still be acceptable to be done:
  • Rough - Middle of November
  • Functional Proto - Beginning of Capstone Semester
  • Final Skeleton - Beginning of April
The skeleton was designed to be simple, yet complete enough to be worth showing off. This gives me a target I can compare against to judge how much time I may or may not have to add additional features (such as being able to send messages, the twist on displaying other users in the timelines, and being able to experiment further).

The idea is that the functional prototype is ready to test at the beginning of the capstone semester so that I can start a schedule of test one week and refine the next. It's such a simple framework that it could reasonably be modified to fulfill my time needs, whether that means more time to test or refine, or more consecutive time for either. Ideally I would continue this process of refine and test up until April, at which time I would make any and all final touches. I would also be adding (and then testing) features on the fly if I feel I'm in a position to do so.

Overall, I've front-loaded my clearest areas of flexibility to before the capstone semester starts, since my current schedule is more packed than my capstone semester will be and therefore there's greater opportunity for things to fall through the cracks. Also, having flexibility in the R+D stage of production lends me more time to experiment before I have to focus on having a complete product.

Sunday, October 9, 2011

More Progress - 10/09/11



I now have hand tracking independent of the pre-drawn skeleton. I map the 'cursor' to the first hand it can track that is above the hips and on its side of the body. For instance, if the right hand crosses over the center of the body, it stops tracking and will switch tracking to the other hand if it's valid.
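The selection rule described above can be sketched as follows. This is an illustrative Python version with hypothetical joint names (the real project is C# against the Kinect SDK, and its joint identifiers differ):

```python
def pick_active_hand(skeleton):
    """Return 'right' or 'left' for the hand to map the cursor to, or None.

    A hand is a valid cursor candidate only while it is above the hips and
    on its own side of the body's center line; when the active hand crosses
    center, tracking switches to the other hand if that hand is valid.
    `skeleton` maps joint name -> (x, y); the joint names here are
    hypothetical, not Kinect SDK identifiers. Assumes y increases upward.
    """
    center_x = skeleton["spine"][0]
    hip_y = skeleton["hip_center"][1]

    def valid(hand, side):
        x, y = skeleton[hand]
        above_hips = y > hip_y
        own_side = x > center_x if side == "right" else x < center_x
        return above_hips and own_side

    if valid("right_hand", "right"):
        return "right"
    if valid("left_hand", "left"):
        return "left"
    return None
```

Checking "own side" against the spine's X coordinate is what makes the hand drop out of tracking the moment it crosses the body's center, as described above.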

Tuesday, September 27, 2011

Project Development Issues 9-27-11

Time Management Issues

Working two part-time jobs while being a full-time student makes finding time to work on the Capstone a high-wire act. Using history as an example, if I don't give myself time during the week to relax, I run the risk of self-destructing. I'm not too worried about burning myself out as I'm enjoying the work so far and next semester, I will be working only one of the two jobs at the most.

Required Skills

  • C# Programming - I'm comfortable with this as C# is syntactically similar to ActionScript and Javascript, languages that I'm more proficient with
  • Usability Design - I have a general "eye" for this type of design, but I hope to have this augmented by getting Dr. Pfaff as a secondary mentor, if things work out
  • Aesthetic Design - This is probably my weakest area, but I've conscripted some help from friends of mine on Twitter who are skilled in the type of design I'm looking to use for this program


Resources

  • Kinect - already acquired prior to starting the project, but I may need to get two more to have high-fidelity control on the side screens of the cave
  • Nyko Zoom for Kinect (wide angle lens) - Same as above except purchased recently
  • Virtual Reality Theater access - This is for the screens as their size allows for a superior sense of immersion when compared to an arrangement of three standard-sized monitors or televisions.
  • A laptop capable of running the final product - My current laptop may be capable of handling this, but I will likely be upgrading later this year
  • Visual Studio 2010 - already acquired through DreamSpark


Budget

  • Nyko Zoom - $30
  • New laptop - between $1000 - $1500
  • Potential new Kinects - $300 for two
  • Potential new Zooms - $60 for two
  • If I somehow end up with a surplus of time and money, I may look into porting the program to become a touch-based Windows 8 app, which would require either a tablet or laptop with a touchscreen. Tablets run between $500 and $1000, but that is a high-end, spit-and-polish "feature" and not something I'm planning for


Team Issues

The biggest team issue I see is going to be working with the Twitter friends to get UI designs. I have never been on the client-esque side of a project, and being friends with the people I'm working with will either be an incredible boon or the bane of my existence; it's early enough that I have no idea which way it will go. I don't have any concerns about the quality of their work, I just don't know how we will work through getting content to and from each other on a timely basis.

Sunday, September 25, 2011

The Face of Progress (and the head of infinitely changing colors) UPDATED



Modified the pre-built SkeletalViewer code in the Kinect SDK to demonstrate control over individual points of articulation... by randomly changing the color and size of the head Point.


I am now able to do comparisons of the positions of Points of Articulation. The left arm becomes thicker as the left hand rises above the left shoulder. While positively mundane, this ability will potentially be the linchpin to large portions of my interface. Being able to detect and compare the positions of joints in the arm will let me give people special commands that are enabled by holding an arm a certain way.
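The comparison described above boils down to a single coordinate test on two joints, plus a scaling rule for the visual feedback. Sketched in Python for illustration (the real comparison runs on Kinect SDK joint positions in C#; the function names and the scaling constants are my own assumptions, not values from the project):

```python
def arm_raised(hand_y, shoulder_y):
    """True when the hand joint is above the shoulder joint.

    Assumes y increases upward; the Kinect skeleton stream actually
    reports joint positions relative to the sensor.
    """
    return hand_y > shoulder_y

def arm_thickness(hand_y, shoulder_y, base=2.0, scale=10.0):
    """Draw the arm thicker the higher the hand sits above the shoulder.

    `base` and `scale` are arbitrary demo constants, analogous to the
    post's 'left arm becomes thicker' visualization.
    """
    return base + scale * max(0.0, hand_y - shoulder_y)
```

A "special command enabled by holding an arm a certain way" is then just gating input on `arm_raised(...)` each frame, which is why this small comparison carries so much of the interface.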

Ideally, there will be pre-built gesture support (that I haven't yet investigated), but presuming there isn't (or presuming that the gestures I want to use aren't supported), I now know I can build them myself. It probably isn't necessary to say that's a big deal.