Jul 03

A while back, Tim Buntel called for some real reasons why people wanted to see interfaces added to ColdFusion. At CFUNITED the debate raged on… I was feeling relatively smart until I stumbled into this conversation between developers Max Porges and Barney Boisvert on the patio after the conference. I had been talking with the others, but when the conversation leaped 40,000 feet above my head I realized it was time to shut up and press the record button. I managed to capture a riveting 20-minute debate on “how OO ColdFusion should really strive to be.” Also present were developers Jared Rypka-Hauer, Paul Kenny, Simeon Bateman and two other guys whose names I’ve since forgotten (need that CFUNITED yearbook). If you can ignore the smooth jazz in the background and the occasional city bus that tramples the audio, it’s actually a very good debate. Feel free to chime in here and contribute your thoughts. Total time: 18:46. Listen to the audio here.

Jul 01

Well, CFUNITED is over and all I can say is “wow.” Having helped staff large conferences in the past, I can vouch that it is a colossal undertaking to coordinate every detail of an event like this, and Michael Smith and his crew did a superb job. The content was outstanding and the execution was flawless. I’m back at my buddy’s house in Arlington, VA with a backpack full of schwag, a wallet full of business cards and a head full of some very actionable ideas I plan to implement. The best way I can think of to condense what I learned is a three-part series of posts over the next few days (I’m staying here so I can do DC on July 4th – I imagine that has to be the epicenter of the fiesta for this holiday). So I’ll do a braindump here with my take on the general aspects of the conference and save a summary of what I took from the sessions and hallway discussions for tomorrow.

What worked well

  1. Registration – went off without a hitch and that’s tough to do.
  2. Schedule / room allocation / logistics – great job juggling disparate topics and using the concept of “tracks” to keep attractive options for different folks at every session time. In talking w/ Michael afterwards, I learned that Teratech used online pre-surveys to determine what sessions attendees wanted in order to decide which topic would get the larger room in each time slot – genius. The breakdown/setup of the rooms itself was quite a feat to watch – the hotel staff would move these enormous partitions around and create individual rooms or collapse them to form one ginormous room for the keynotes.
  3. Speeches – the content and delivery of the presentations were top notch. I was a Joel Spolsky fan before the show, having listened to his talk on ITconversations.com and followed his blog for months, so it was a treat to see him from the first row doing his thing live. There was not one presentation that I regretted attending, and I went to all but two of the sessions.
  4. BoF – the “Birds of a Feather” informal discussions at night worked really well. Sitting in the back of the room at the Model-Glue/Fusebox/Mach-ii discussion, I have to admit it felt a bit like being at the 2001 World Series, watching a historic interaction of all the great players in one room. To be sitting there in front of the Macromedia dev team and witness the direct, honest communication with the community they serve was an honor. I swear, the mental firepower in that room… if I ever have the ducats to afford a crazy mansion, I’ll just hire that room of people instead and make amazing things. What a fun team that would be to be a part of.
  5. Ubiquitous WiFi – the conference hall area had an open connection, and at least 30% of the attendees could be seen at any given time huddled near an electrical outlet checking email, posting and reading blogs. It also meant you could load the CD of presentations on your laptop and follow along while the presenter was talking, or even research the sites and concepts while they spoke. At one point I was talking with a lady and she pointed out something very true: “at these conferences WiFi has come to be as expected an amenity as drinking water.” The connection had problems the first day, either from oversaturation or due to the login, but the Marriott promptly opened up another connection and all was well. I did consider busting out Whoppix to see exactly how many cleartext passwords were flying around the airwaves, but I decided against it.
  6. Paperback schedule guide – the thick paperback index of all speakers, topics and presentations that was distributed at registration was sheer genius. When I first got it I thought “what a waste, why wouldn’t they just leave this on CD?” But having a physical paper guide you could flip through between sessions to review the slides of the potential presenters was a great decision-making tool. There were a few toss-ups where I wish I could have attended two sessions occurring simultaneously, but the speakers were very approachable and most understood and offered to send any extra materials upon request. I think every speaker I saw left his/her email on the PowerPoint for attendees to contact them with questions.
  7. Good feedback mechanisms – the hotel staff were meticulous in distributing and collecting surveys after each presentation, and Liz summarized the results on stage in the wrap-up session, reading back comments that testified to the quality of the speakers. This kind of feedback capture (and redistribution) is critical in order to know how to adjust for future shows and reinforce the speakers with praise. I did not realize that speakers were NOT paid for their talks – like college ball, “everyone was there for the love of the game.” Really a good vibe.
  8. Accessibility – I had a one-hour commute every day on the metro to Arlington, which meant I actually had to use an alarm clock for the first time in about a year. Whether intentional or not, it was great that they chose a hotel with a metro stop just a block away.

What was lacking (and this is a much shorter list)

  1. Depth and detail on many talks – I realize it’s difficult to present topics to an audience of varying skill levels and try to roll a day’s worth of material into 50 minutes, but I found a lot of the presentations to be good overviews lacking in depth and usable “next steps” type suggestions. On more than one occasion I found the preso getting really interesting right about the time the hall monitor would step in and signal the wrap-up.
  2. AZCFUG Model-Glue breezo – It’s a shame, but Joe Rinehart’s Breeze preso on Model-Glue just didn’t do justice to his talk the day before or even the actual live event itself. In Joe’s defense, it’s VERY difficult to juggle both a live audience and a remote one; the faces in front of you tend to take precedence. As far as the lack of planning goes, I think Joe had quite a bit on his plate having done an all-day class the day before (which was excellent). I checked out the breezo and the quality is crappy, but sh*t happens – just download the framework and tinker. I’m sure he’ll do another one soon.
  3. Conference fatigue – I don’t know how you solve this one, because the alternative means you don’t cram as much valuable stuff in, but I was pretty fried by 5pm on the second day and ended up leaving a little early. You’re just barraged with so much dense, valuable info and surrounded by droves of people, and combined w/ the lack of sleep, it grinds you down by the end of the day. To their credit, they did have a masseuse in the hallway giving massages. I think the solution is to avoid extended stays w/ fellow developers at the bar late night… <!-- READ: not really, just drink 2 redbulls after lunch -->

Suggestions for improving things

  1. Archive the follow-up discussions – by far the meatiest part of the talks was the five minutes after the presenter had finished, when the room was clearing out and a small swarm of the people with lightbulbs going off were asking questions. This fertile discussion should be archived, whether via iPod or video. I caught a fantastic 20-minute discussion on my iTalk out on the hotel patio afterwards amongst some really bright guys. I’ll try and get it posted here soon.
  2. Conference “Yearbook” – at my university freshman year, when we arrived on the first day, we were given essentially a pre-yearbook with pictures of our freshman class that everyone had submitted. I thought it was great because you are blitzed with all these familiar names and a lot of times you’re putting a face to a name for the first time. I will never forget a face, but I’m terrible at remembering everyone’s name. It would be great to have a way to leaf through a document that matched the face from the conference w/ that familiar name. When I did Proscout events, we had a webcam at registration, snapped a photo and quickly named it with the participant’s ID number from their badge. It worked well and a similar thing could be done here for those that wished to participate. Taking it a step further, they could integrate it into Jared’s login for his “interest matching app” and make it so you could view this online, fix your picture if you didn’t like it and even expose a webservice to display a tiny face thumbnail when people comment on blogs (a rough sketch of what that webservice could look like follows this list). Don’t underestimate the value of faces in this industry.
  3. Visual Likert scale on feedback forms – I screwed up on the first day and gave a bunch of 1’s and 2’s on my ratings of speakers because (stupidly) I didn’t read the directions and thought “1” was the best rating to give. The guys at 37signals would say do something simple – a scale that looks like this:
    :-( 1 2 3 4 5 :-)
    I talked with a guy who had made the same mistake and corrected me after seeing me circle all 1’s on Simon Horwith’s preso. As simple as it is, a visual Likert scale would have cleared up that confusion easily.
  4. Easier way to look up slides in the paperback book by timeslot – this is a small detail, but it would have been nice to have either the page number of the slides right on each timeslot in the schedule or some cross-reference to make looking this up a little easier.
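
For the yearbook idea above, a remote-access CFC is probably all that thumbnail webservice would need to be. This is just a hypothetical sketch – the component, method and URL names are all invented here, not anything Jared’s app actually exposes:

    <!--- AttendeePhotoService.cfc – hypothetical sketch, all names invented --->
    <cfcomponent displayname="AttendeePhotoService">
        <cffunction name="getThumbnailURL" access="remote" returntype="string" output="false">
            <cfargument name="badgeID" type="numeric" required="true">
            <!--- A real version would look the attendee up; here we just build the image URL --->
            <cfreturn "http://example.com/cfunited/photos/thumb_" & arguments.badgeID & ".jpg">
        </cffunction>
    </cfcomponent>

A blog could then call that method remotely (via cfinvoke or the auto-generated WSDL) and drop the returned URL into an image tag next to the comment.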

I’ll post a summary of all the takeaways I gleaned tomorrow. I took six pages of notes and have a lot of things I want to tinker with. Kudos to Teratech and all the speakers on an excellent event. Being an independent consultant, I have to pay my own way to these things, but I felt I got more than my money’s worth. Cheers.

Jun 12

I have a client that requires an Enterprise Resource Planning system in order to comply with an upcoming FDA audit. In searching for an acceptable solution I ran across systems that cost hundreds of thousands of dollars (apparently some of the larger companies like Dell pay millions of dollars for their ERP systems). How is the little guy to enter the fray when such prohibitive barriers to entry bar the way? The answer I discovered was an open-source project on sourceforge.net called “Compiere” (which apparently in Italian means “to accomplish, fulfill or deliver” – very appropriate). This project has consistently been in the top four active projects on SourceForge, which bodes well: it has a huge developer community and is growing rapidly, with plenty of new modules and features to accommodate every conceivable business need. I was stoked to find a system that was: 1) free 2) cross-platform 3) very much alive 4) and has no shortage of consultants to help in a bind. The only problem was that it requires an Oracle 10g database to run on – and everyone knows Oracle licensing ain’t cheap.

I hunted around and found something called CODAF (Compiere and Daffodil DB) which sounded promising, but after tinkering with it for about an hour and digging through their site grappling w/ various problems getting the database working, I realized it wasn’t truly a free solution after all, so I kept up the search. Then I discovered a database called Fyracle which promised to emulate Oracle closely enough that Compiere wouldn’t know the difference. Their install instructions for Windows were very straightforward and I ran into only a few rough spots before I had the whole thing set up and running on XP. I was tempted to try their instructions for Linux, but they were pretty involved – about three pages long – and I’m honestly not comfortable w/ Linux and the command line yet. I gotta say both installers ran flawlessly and so far so good – I had it up and running in less than an hour.

Even with a fairly simple install process, I thought making a video tutorial with RoboDemo would give people a good overview of what’s involved before actually diving into the specifics, and it might clear up some of the minor weirdness I discovered in the docs (like having the Compiere install dir default to the D: drive, which on most machines is the CD-ROM). For me, learning how to use this ERP system is the next step in this project, but if the Compiere user guide is anything like the install documentation I’ve seen so far, it should be fairly smooth to pick up. I would really like to see some tutorials like this one made by some of the more experienced Compiere users to show how the software is used for common daily tasks like inventory control, supply chain tracking, accounting integration and CRM functions. Perhaps someone at Compiere will dig this method of teaching and adopt the idea of producing some video tutorials. Enjoy!

Watch the video tutorial on setting up Compiere ERP on the Fyracle database and Windows XP in five minutes
7.1MB ~5min

PS. I know my blog is now syndicated from a couple of different sources, ranging from ColdFusion to legal technology. I generally post about whatever “ah-ha” moments I have and stuff that I find useful and interesting. I have my feet in both the legal and CF swimming pools – hopefully peripheral topics like this one do not alienate either camp by falling on the side of being too technical or not technical enough. My rule of thumb for writing here has always been “stuff I wish someone would have told me in the first place” and that continues to be the compass by which I align my postings.

Jun 02

I’m pretty sure that K2 was that insane mountain that John Cusack tried to ski down in Better Off Dead (“have you any idea of the street value of this mountain? it’s pure snow!!”). Well, the “street value” of Verity K2 is no less amazing. It’s not often you have the opportunity to improve your application’s performance by 10,000% through an hour’s worth of work. In migrating an application to a new server tonight I was reminded of just how much better Verity K2 Server is than the standard VDK version that comes bundled with CF, and yet not everyone is utilizing this feature. VDK is apparently a file-based approach to indexing, while K2 is a server-based approach to the same thing. If you’re on Windows you can (and should) set it up to run as a service. I would say anyone running CF Enterprise who is doing any type of searching with Verity would strongly benefit from the hour or so investment it takes to get going w/ K2. There is a good tutorial here that walks you through the process of switching modes. I’m using Verity now to index a giant application-scoped query of 5,000 contacts against which I compare a user-submitted list of names in my Sentinel application. My app literally performs 100x faster with K2 than it did with the vanilla install of Verity. Just look at the speed comparison using the getTickCount() function:
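
For anyone who wants to reproduce that comparison themselves, this is roughly the pattern – a minimal sketch where the collection name and search criteria are placeholders rather than my app’s actual values:

    <!--- Time a Verity search; "contacts" and the form field are placeholder names --->
    <cfset startTime = getTickCount()>
    <cfsearch collection="contacts" name="qResults" criteria="#form.lastName#">
    <cfset elapsed = getTickCount() - startTime>
    <cfoutput>Found #qResults.recordCount# matches in #elapsed#ms</cfoutput>

Run the same page against the collection in plain VDK mode and again after switching it over to K2, and the difference speaks for itself.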

The only gotcha I ran into in migrating this app to a new server is that you can’t create a K2 index without first creating your standard Verity collection. Once you have your regular VDK collection running successfully, follow the steps in the tutorial above to make it hum w/ K2. Also, when moving from one server to another, simply copying the collection files does not work; you need to create the collection from scratch through the administrator. Performance gains on the order of 20% are generally impressive, but 100-fold improvements are unheard of. Chalk one up for CF.
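
If you’d rather script the collection rebuild than click through the administrator, the same thing can be done in code. Again, just a sketch under assumptions – the path, query and column names below are invented, not the ones from my app:

    <!--- Create the base VDK collection first (the path is an example; adjust for your install) --->
    <cfcollection action="create" collection="contacts" path="C:\CFusionMX7\verity\collections\">
    <!--- Then populate it from an application-scoped query; column names are placeholders --->
    <cfindex collection="contacts" action="refresh" type="custom"
             query="application.qContacts" key="contactID" body="fullName">

The K2 switch only comes after this standard collection exists and indexes cleanly.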

Tagged with:
May 23

I was roped into this chain letter by Rob Brooks-Bilson :-) My contribution to the meme:

Total Volume (of my MP3 library): 74.3 GB

Last CD Bought: “Garden State Soundtrack” – Various Artists

Song Playing Right Now: “Unsung” – Helmet

Five Songs I Listen to a Lot:

  • Boombastic Radio shoutcast stream
  • Flying Horses – Dispatch
  • Trip Like I Do – Crystal Method (Filter Remix)
  • Lucky Denver Mint – Jimmy Eat World
  • Collide – Howie Day

Five People to Whom I’m Passing the Baton:

May 19

With the miracles of modern technology, this morning at 35,000 ft I was able to get a full install of Fedora Core 3 running on my new laptop under Virtual PC. This Inspiron 9300 I got is nothing short of amazing – I tricked it out w/ 2 GB of RAM earlier this week, and w/ the 2.0GHz Pentium M chip this thing just smokes. Hats off to Virtual PC too – this program is amazing, software that can emulate a machine to the point where the installed OS THINKS it’s running on a real box. I got to thinking about the possibilities of virtual PCs running other virtual PCs, but it became a bit too “Malkovich Malkovich” trying to grasp the existential implications of that kind of virtual recursion, so I let it go and screwed around with Linux instead.

The Fedora install is part of a larger goal I have right now of getting Subversion source control running, first locally on my laptop and then on my buddy’s Linux box. I will say that getting Fedora on VPC is not without its headaches – first you gotta get the ISOs, which are a full 2.3GB in all. I found this torrent tracker hosted at Duke University and was able to use BitTorrent to get about 250Kbps on my cox@home connection. Once you have the ISOs, you can walk through the wizard for creating a new virtual machine in VPC. You then use the CD > Capture ISO option to fool the virtual machine into thinking it’s got the CD-ROM in its drive. So far so good – the rest of the Fedora install is very clean, and they have a nice GUI wizard interface to customize it for your needs. There is one major hack, however, that you must apply to the kernel in order to get it to recognize your monitor. The first time I went through it, it installed flawlessly, but when I rebooted it threw an error saying “ID 2 Respawning too quickly – shutting down for five minutes.” After some serious digging in various forums I found this post that contains the necessary set of instructions (56 individual steps in all) to get it to work (and I can confirm that it does in fact work). The crux of the issue apparently stems from the OS’s inability to recognize the monitor of a virtual machine; you have to set this explicitly using some Linux commands and install a kernel modification. Once that was done, the install worked perfectly.

On another note, tomorrow is my brother’s wedding. I’m sitting here in my room at the Marriott in Monterey watching seagulls out my window (which is an absolute novelty for us Arizonans) and trying to come up with a creative slideshow that will embarrass the hell out of him at the rehearsal dinner tomorrow night. I snapped this pic on my Treo and stored it to the SD card:

What’s cool is my laptop has a built-in SD reader, so transferring pics is simple – you pop in the card and it runs a quick sync tool that maintains a folder on your hard drive called PalmOne with everything your card has. From my room I can see no fewer than twelve separate wireless networks on my laptop (about half of which are unencrypted), but apparently they all have MAC address filtering turned on or the signal is just too weak, as packet loss is about 75%. I have this nifty program called PDAnet for my Treo that turns it into a modem for my laptop. It gets decent speeds too (about comparable to a 56k modem), and I hear the Sprint network can do about twice that (I’m on Cingular). Anyway, I will definitely post my experience in getting Subversion to work, as it seems like quite a process from their site. Looking forward to this weekend. In the span of three weeks it’s been Connor’s bachelor party, followed by my 30th b-day, followed by his wedding – June is shaping up to be a month of detox for sure!
