A Day in the Life

A day in my life. Thoughts on leadership, management, startups, technology, software, concurrent development, etc... Basically the stuff I think about from 10am to 6pm.

2/27/2006

Digipede: Distributing Excel Computations – Part 4

Now that I’ve run my calculations I’m ready to return the results to the Master workbook. As I said at the end of Part 3, I’m going to save my results into a results workbook and send that workbook back to the Master workbook. This means defining another FileDef in my Master workbook, and since I want my results in a different directory than my input files, I create a new RemoteLocation as well.

Updates to Excel Master with VBA code-behind: (New code bolded)
. . .
' Create a RemoteLocation for the Worker Script
Dim location As Digipede_Framework.RemoteLocation
Set location = New Digipede_Framework.RemoteLocation

location.ID = 0
location.location = fileShareLoc
location.ProtocolType = Digipede_Framework.ProtocolType_Share
jobTemplate.RemoteLocations.AddItem location

' Create a RemoteLocation for the Results files
Set location = New Digipede_Framework.RemoteLocation
location.ID = 1
location.location = fileResultsLoc
location.ProtocolType = Digipede_Framework.ProtocolType_Share
jobTemplate.RemoteLocations.AddItem location

.
.
.
' Create a filedef for the Data workbook
Dim dataWBFileDef As Integer
Dim strInputDataFile As String
strInputDataFile = "InputDataFile"

Set fileDef = New Digipede_Framework.fileDef
fileDef.ID = 2
dataWBFileDef = fileDef.ID
fileDef.RemoteLocationId = 0
fileDef.Name = strInputDataFile
fileDef.Relevance = Digipede_Framework.Relevance_InputPlaceholder
jobTemplate.FileDefs.AddItem fileDef

' Create a filedef for the Results workbook
Dim resultsWBFileDef As Integer
Dim strResultsDataFile As String
strResultsDataFile = "ResultsDataFile"

Set fileDef = New Digipede_Framework.fileDef
fileDef.ID = 3
resultsWBFileDef = fileDef.ID
fileDef.RemoteLocationId = 1
fileDef.Name = strResultsDataFile
fileDef.Relevance = Digipede_Framework.Relevance_ResultPlaceholder
jobTemplate.FileDefs.AddItem fileDef

.
.
.
Dim strResultFile As String

' Create a job
For i = 1 To 5
Set oTask = New Digipede_Framework.Task

' Set up the data file transfer
strDataFile = "Data" + Application.WorksheetFunction.Text(i, "00") + ".xls"

Set fileDef = New Digipede_Framework.fileDef
fileDef.ID = dataWBFileDef
fileDef.RemoteLocationId = 0
fileDef.LocalName = "Data.xls"
fileDef.RemoteName = strDataFile
fileDef.Name = strInputDataFile
fileDef.Relevance = Digipede_Framework.Relevance_Input
oTask.FileDefs.AddItem fileDef

' Set up the Results file transfer
strResultFile = "Result" + Application.WorksheetFunction.Text(i, "00") + ".xls"

Set fileDef = New Digipede_Framework.fileDef
fileDef.ID = resultsWBFileDef
fileDef.RemoteLocationId = 1
fileDef.LocalName = "Result.xls"
fileDef.RemoteName = strResultFile
fileDef.Name = strResultsDataFile
fileDef.Relevance = Digipede_Framework.Relevance_Result
oTask.FileDefs.AddItem fileDef


Dim taskParameter As Digipede_Framework.parameter
Set taskParameter = New Digipede_Framework.parameter
taskParameter.Name = parameter.Name
taskParameter.Value = i
oTask.Parameters.AddItem taskParameter

mJob.Tasks.AddItem oTask
Next i


The changes to the Master are all that is required to move the Results file. But there are some important issues that need to be addressed in how the Worker saves the Results.xls file.

By default, Excel saves a file in the last specified save directory, so you must save the Results file using a fully qualified path. Since we set Application.DefaultFilePath in the RunWorker script, we use it to save the Result file in the location where the Digipede Network expects to find it.


ResultFileName = Application.DefaultFilePath & "\Result.xls"
.
.
.
oResultBook.SaveAs Filename:=ResultFileName
oResultBook.Close False
Set oResultBook = Nothing


The last thing we need to do is make sure that the Results directory specified in the Master variable fileResultsLoc allows the Digipede Network to write to it. If your Results directory is not a top-level directory, make sure that you change the permissions on the top-level directory as well.

In Windows Explorer, right-click the Results directory, select ‘Properties’, then the ‘Sharing’ tab. On the ‘Sharing’ tab, click the ‘Permissions’ button, and in the ‘Permissions for Results’ dialog check ‘Full Control’. Click ‘OK’, then ‘OK’ again.

We have one last piece to do to finish the development part of this project and that is to pull all the Result files together. In Part 5, I will show you how the Digipede Network notifies the Master application when a Job is complete and how the user can use that notification to process the results.

Essay Links:

Start
Automation
Part 1
Part 2
Part 3
Part 4
Part 5


Digipede: Distributing Excel Computations – Part 3

In Parts 1 and 2 I showed you how to move the Worker workbook, the RunWorker script, and the Data workbook to the compute nodes. In Part 3 I’m going to show you how to start the computation in the Worker workbook from the RunWorker script.

I’m going to let the Worker workbook open the Data workbook itself. The RunWorker script could open the Data workbook and populate the Worker, but this is an Excel sample, so I’m going to let Excel do all the heavy lifting.

RunWorker must make an Excel VBA call to cause the Worker to begin work. You have two choices: Calculate () or Run (). I recommend staying away from Application.Calculate () because if the Worker is running on a workstation that already has other Excel workbooks open, Application.Calculate () will initiate calculations on those other workbooks. We don’t want that to happen; it breaks one of the first rules of software development: play well with others. If we’re lucky there will be a Workbook.Calculate () in Excel 12, but it’s not available now, so our safest bet is to stick to either Application.Run () or Worksheet.Calculate ().

Generally, if your Worker workbook calculations are initiated by a subroutine, you would call Application.Run () from the RunWorker script. If you initiate calculations by pressing F9 in Excel, you will need to identify the worksheet hierarchy and call Calculate () on the topmost worksheet; the calculation will then ripple through the workbook.
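For the F9 case, the call from the launch script would look something like this (the sheet name “Model” is hypothetical, not from the sample):

' Recalculate only the topmost sheet of the Worker workbook;
' unlike Application.Calculate (), this leaves other open workbooks alone.
WBook.Worksheets("Model").Calculate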

I’m going to use Run () in this example because all of the parallelizable Excel workbooks I’ve seen so far have had code-behind.

The reason Part 3 has taken so long to get out is the trouble I’ve had figuring out the format the macro name needs to take for Run () to execute. I thought I had understood it, but I find that I understand it much better now! The macro name sent to Run () is made up of three parts: the filename, the Excel object name, and finally the macro name.

To get the names of the macros available in an Excel workbook:
    1. Open the workbook in Excel

    2. From the Menu Bar select ‘Tools’ then select ‘Macro’ and finally select ‘Visual Basic Editor’ – these steps open Microsoft Visual Basic with the code-behind for this workbook.

    3. Within Visual Basic, from the Menu Bar select ‘Tools’ then ‘Macro’. You now see the list of macros available within the workbook for external reference. If the macro you want to call is not listed there, you must redeclare the macro as Public. The format of the macro in the list box gives the second and third parts of the macro name that Run () needs. The first part is the fully qualified path and name of the workbook (without the .xls extension). It is a safe practice to put single quotes around the fully qualified filename in case there are spaces somewhere in the name.
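For illustration only (the path here is made up, not from the sample), the three parts combine like this:

' 1: quoted workbook path (no .xls)   2: object   3: macro
strMacroName = "'C:\Grid Files\Worker'" & "!" & "Sheet1.btnStartCalc_Click"
' yields: 'C:\Grid Files\Worker'!Sheet1.btnStartCalc_Click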
Update RunWorker Launch Script:

.
.
.
WBook.Sheets("Automate").Range("O3") = TaskId

Dim strMacroName
strMacroName = "'" & strPath & "\Worker'" & "!Sheet1.btnStartCalc_Click"
myExcelWorker.Run strMacroName
.
.
.



Once the calculation is complete you have the option of storing the results in a database or a results file. I’m going to store my results in a result workbook and send that back to the Master, since that demonstrates Digipede-specific functionality. I’ll show you how to do that in Part 4.

Essay Links:

Start
Automation
Part 1
Part 2
Part 3
Part 4
Part 5


2/23/2006

Links: February 23, 2006

So much left to explore here on Earth:
Ghost Cave
Unexplored Paradise
Unidentified critter found with some shrimp

Good programming tips:
http://www.irishdev.com/NewsArticle.aspx?id=1857

New tool to help bloggers move to the A-List?
BlogBurst

What’s up with this?
Patent granted for Internet Rich Applications? Why! How!

2/22/2006

Digipede: Distributing Excel Computations – Part 2

Yesterday I created a JobTemplate that moved template level files (the Worker workbook and the RunWorker script) to the compute nodes. Today I’m going to solve the problem of telling the Worker what to do. The value of a Worker is that it allows you to parallelize the execution of your spreadsheet. In the Digipede Network a Job is made up of Tasks, and each Task sets up a parallelized version of your computation.

Usually each computation requires different “seed” data. With Excel we can use several different techniques to get the seed data from the Master to the Worker.

In Part 1 you saw code in the RunWorker script that put a value into a cell on a worksheet in the Worker workbook. I created a Digipede Parameter in the Master workbook to send a seed value to the RunWorker script. I then wrote the value to the Worker workbook so that I could see that the RunWorker script executed. I can create as many Parameters as I want, but beyond a certain number things get unwieldy. If you want to use Parameters with Excel, you have no choice but to use some other application or script to launch the Worker workbook, because Excel does not pass command-line parameters to the workbook.*

When you have large amounts of data to send to the Worker, you can use a seed file, such as another Excel workbook, or you can extract the data from a database. Since I already showed you how to send a Parameter to a Task, I’m now going to show you how to send a seed file. Accessing a database doesn’t require anything Digipede-specific: you won’t be moving any files, and the key(s) you’ll need for your search can be sent in as Parameters.
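As a sketch of the database route (the connection string, table, and column names here are all hypothetical), the launch script can pull its seed data with plain ADO, using the key that arrived as a Parameter:

' Hypothetical ADO lookup; the key arrives as a Digipede Parameter
Dim conn, rs, seedKey
seedKey = WScript.Arguments(0)
Set conn = CreateObject("ADODB.Connection")
conn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
          "Initial Catalog=SeedData;Integrated Security=SSPI;"
Set rs = conn.Execute("SELECT SeedValue FROM Seeds WHERE TaskId = " & seedKey)
' ... use rs("SeedValue") to populate the Worker ...
rs.Close
conn.Close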

Updates to Excel Master with VBA code-behind: (New code bolded)

' Create a filedef for the Worker Excel file
Set fileDef = New Digipede_Framework.fileDef
fileDef.ID = 1
fileDef.RemoteLocationId = 0
fileDef.LocalName = "Worker.xls"
fileDef.Name = "Worker.xls"
fileDef.Relevance = Digipede_Framework.Relevance_JobTemplate
jobTemplate.FileDefs.AddItem fileDef

' Create a filedef for the Data workbook
Dim dataWBFileDef As Integer
Dim strInputDataFile As String
strInputDataFile = "InputDataFile"

Set fileDef = New Digipede_Framework.fileDef
fileDef.ID = 2
dataWBFileDef = fileDef.ID
fileDef.RemoteLocationId = 0
fileDef.Name = strInputDataFile
fileDef.Relevance = Digipede_Framework.Relevance_InputPlaceholder
jobTemplate.FileDefs.AddItem fileDef

.
.
.
Dim strDataFile As String

' Create a job
For i = 1 To 1
Set oTask = New Digipede_Framework.Task

strDataFile = "Data" + Application.WorksheetFunction.Text(i, "00") + ".xls"

Set fileDef = New Digipede_Framework.fileDef
fileDef.ID = dataWBFileDef
fileDef.RemoteLocationId = 0
fileDef.LocalName = "Data.xls"
fileDef.RemoteName = strDataFile
fileDef.Name = strInputDataFile
fileDef.Relevance = Digipede_Framework.Relevance_Input
oTask.FileDefs.AddItem fileDef


Dim taskParameter As Digipede_Framework.parameter
Set taskParameter = New Digipede_Framework.parameter
taskParameter.Name = parameter.Name
taskParameter.Value = i
oTask.Parameters.AddItem taskParameter

mJob.Tasks.AddItem oTask
Next i


To move a Task level file you must create two FileDefs for it. The first FileDef belongs to the JobTemplate and lets the JobTemplate know that there will be Task level files to move. The second FileDef belongs to the Task and identifies the file to move.

' JobTemplate level FileDef
Set fileDef = New Digipede_Framework.fileDef
fileDef.ID = 2
dataWBFileDef = fileDef.ID
fileDef.RemoteLocationId = 0
fileDef.Name = strInputDataFile
fileDef.Relevance = Digipede_Framework.Relevance_InputPlaceholder
jobTemplate.FileDefs.AddItem fileDef


Notice that the new JobTemplate FileDef has a Relevance of InputPlaceholder. Whenever you create a JobTemplate FileDef with the Relevance of InputPlaceholder, the Task definition must contain an associated FileDef object.

' Task level FileDef
Set fileDef = New Digipede_Framework.fileDef
fileDef.ID = dataWBFileDef
fileDef.RemoteLocationId = 0
fileDef.LocalName = "Data.xls"
fileDef.RemoteName = strDataFile
fileDef.Name = strInputDataFile
fileDef.Relevance = Digipede_Framework.Relevance_Input
oTask.FileDefs.AddItem fileDef


I did a little trick in the creation of the Task’s FileDef in that I changed the name of the data file. I told the Digipede Network that I wanted to move a file named “Data##.xls”, but when my file gets to the compute node I want it renamed to “Data.xls”. This makes it much easier for my Worker workbook because I don’t have to tell the Worker the name of the input data file.

Because template level files are expected to be the same for all the Tasks, you have the option to leave those files on the compute node so that you don’t have to move them again. This comes in handy when debugging, and it is one element of a technique I will discuss later that can save network bandwidth. Task level files behave differently: they are expected to be different for each Task, so they are deleted when a Task completes.

I have also added code to validate the JobTemplate before I submit it to the Digipede Network. This is a good code snippet to have: because VBA doesn’t support exceptions, Validate() is your best technique for identifying malformed JobTemplates.

' Validate the job before submitting to get validation information.
Dim valItems As Digipede_Framework.ValidationItemCollection
Dim valSeverity As Digipede_Framework.ValidationSeverity

valSeverity = jobTemplate.Validate(mJob, valItems)
If valSeverity = ValidationSeverity_Error Then
errMsg = "Validation Error Count: " & valItems.Count
For i = 1 To (valItems.Count)
errMsg = errMsg & vbCrLf & " " & valItems.Item(i).Message
Next i
MsgBox errMsg
Exit Sub
End If

If valSeverity = ValidationSeverity_Warning Then
errMsg = "Validation Warning Count: " & valItems.Count
For i = 1 To (valItems.Count)
errMsg = errMsg & vbCrLf & " " & valItems.Item(i).Message
Next i
MsgBox errMsg
Exit Sub
End If

Sheet1.Range("SubmissionTime") = Format(Time, "hh:mm:ss")

ThisWorkbook.mClient.SubmitJob poolId, jobTemplate, mJob, _
    Digipede_Framework.SubmissionOptions_None



At this point the input data file is being copied to the compute node, where either Worker or RunWorker can use it. Part 3 will explore a few techniques for initiating the computation in the Worker.

* As of Excel 2003

Essay Links:

Start
Automation
Part 1
Part 2
Part 3
Part 4
Part 5



More on Brain Candy...

I asked the other day if anyone knew of any mentally stimulating websites mainly because it makes my brain feel happy when I see new and interesting things. Little did I know the importance of external stimulation in the creation of new brain stuff. You can read more about this here and here. Send those juicy sites my way!


2/21/2006

Links: February 21, 2006

I’m thinking spy movie: http://www.physorg.com/news11009.html

This might explain the insane need to help a stranger in trouble: http://news.bbc.co.uk/2/hi/science/nature/4729050.stm

Interesting place to experience: http://www.indystar.com/apps/pbcs.dll/article?AID=/20060129/LIVING03/601290341/1007/LIVING

And a new grid computing paper: http://www-unix.gridforum.org/mail_archive/ogsa-wg/2006/02/msg00032.html

Digipede: Distributing Excel Computations – Part 1

"Yes Virginia, you can run an Excel spreadsheet on a compute grid." It’s true. But you can’t run just any spreadsheet, and there is no magic bullet. You will have to do a little work. Something, a Master, has to initiate the work, and something, a Worker, has to do the work. Whether the initiator is an Excel spreadsheet or a custom application doesn’t matter; whether the work is an Excel spreadsheet, a DLL, or an executable doesn’t matter. All that matters is the relationship between the Master and the Worker.

For this code sample I’m going to use an Excel Master and I’m going to distribute an Excel Worker. I’ve created a closed system on my machine (I have a Digipede Server, Agent, and Framework all installed on one machine) to simplify my testing environment. Remember, when building a distributed application, create your simplest starting case first and then expand as you verify each phase.

The first thing to do is make sure that I correctly tell the Digipede Network what files to move and how to initiate the Worker spreadsheet. I will then test everything on the compute node.

Excel Master with VBA code-behind:
Private Sub btnStartCalculations_Click()

' Perform validation – we need a valid user id, password and Digipede host address
Dim errMsg As Variant
Dim userId As String
Dim password As String

userId = Sheet1.Range("UserId")
password = Sheet1.Range("Password")
If userId = "" Or password = "" Then
errMsg = "Enter a Digipede Network user id and password."
MsgBox errMsg
Exit Sub
End If

Dim hostName As String
hostName = Sheet1.Range("HostName")
If hostName = "" Then
errMsg = "Enter a Digipede Server name."
MsgBox errMsg
Exit Sub
End If

Dim fileShareLoc As String
fileShareLoc = Sheet1.Range("FileShareLocation")
If fileShareLoc = "" Then
errMsg = "Enter the file share location of the files to distribute."
MsgBox errMsg
Exit Sub
End If

' Depending on the version of the Digipede Network
' you may want to setup pools
Dim poolId As Long
poolId = Sheet1.Range("PoolId")
If poolId = 0 Then
poolId = 1
End If

Dim strLaunchScript As String
strLaunchScript = "RunWorker.vbs"

' Disable the Calculate button
'btnStartCalculations.Enabled = False
Sheet1.Range("SubmissionTime") = ""
Sheet1.Range("FinishedTime") = ""
Sheet1.Range("JobId") = ""

' Initialize the client
ThisWorkbook.mClient.SetCredentials userId, password
ThisWorkbook.mClient.SetUrlFromHost hostName

' Create the JobTemplate
Dim jobTemplate As Digipede_Framework.jobTemplate
Set jobTemplate = New Digipede_Framework.jobTemplate

' I set DiscardAfterUse to False because I’m debugging,
' I plan to move some files to the compute nodes and I
' need to manually confirm that after the move everything
' will work the way I expect.
jobTemplate.DiscardAfterUse = False

' Create a RemoteLocation for the Worker Script
Dim location As Digipede_Framework.RemoteLocation
Set location = New Digipede_Framework.RemoteLocation

location.ID = 0
location.location = fileShareLoc
location.ProtocolType = Digipede_Framework.ProtocolType_Share
jobTemplate.RemoteLocations.AddItem location

' Create a filedef for the launch script – the filedef
' defines a file to be moved by the Digipede Network.
Dim fileDef As Digipede_Framework.fileDef
Set fileDef = New Digipede_Framework.fileDef
fileDef.ID = 0
fileDef.RemoteLocationId = 0
fileDef.LocalName = "RunWorker.vbs"
fileDef.Name = "RunWorker.vbs"
fileDef.Relevance = Digipede_Framework.Relevance_JobTemplate
jobTemplate.FileDefs.AddItem fileDef

' Create a filedef for the Worker Excel file
Set fileDef = New Digipede_Framework.fileDef
fileDef.ID = 1
fileDef.RemoteLocationId = 0
fileDef.LocalName = "Worker.xls"
fileDef.Name = "Worker.xls"
fileDef.Relevance = Digipede_Framework.Relevance_JobTemplate
jobTemplate.FileDefs.AddItem fileDef

' Create one parameter placeholder
Dim parameter As Digipede_Framework.parameter
Set parameter = New Digipede_Framework.parameter
parameter.Name = "InputText"
parameter.Relevance = Digipede_Framework.Relevance_InputPlaceholder
jobTemplate.Parameters.AddItem parameter

jobTemplate.Control.CommandLine = "cscript RunWorker.vbs $(InputText)"
jobTemplate.Control.SaveStandardError = False
jobTemplate.Control.SaveStandardOutput = False
jobTemplate.Control.UseShellExecute = True

jobTemplate.Name = "ExcelSubmitterTest1"

' now the JobTemplate is complete; create a job with tasks.
Set mJob = New Digipede_Framework.job
mJob.Name = "ExcelTest1"

Dim i As Integer
Dim oTask As Task

' Create a job but only put one Task in. I want to make
' sure that everything works for one Task before I
' add more.
For i = 1 To 1
Set oTask = New Digipede_Framework.Task

' Later I’m going to pass along an input data file.
' Right now I’m just passing in a parameter that will
' eventually be used to build the name of that file.
Dim taskParameter As Digipede_Framework.parameter
Set taskParameter = New Digipede_Framework.parameter
taskParameter.Name = parameter.Name
taskParameter.Value = i
oTask.Parameters.AddItem taskParameter

mJob.Tasks.AddItem oTask
Next i

Sheet1.Range("SubmissionTime") = Format(Time, "hh:mm:ss")

ThisWorkbook.mClient.SubmitJob poolId, jobTemplate, mJob, _
    Digipede_Framework.SubmissionOptions_None

Sheet1.Range("JobId") = mJob.JobId

End Sub



Notice that I am distributing a VBS script. I have yet to figure out how to get Excel to close properly on a remote machine without a script; code that opens and closes Excel on my machine doesn’t work from a remote call. So for now I invoke the Worker using a script. This actually works fine for this use case because I need to initialize the Worker spreadsheet anyway.

Launch Script to launch the Worker spreadsheet:
' Create a WshShell to get the current directory
Dim WshShell
Set WshShell = CreateObject("WScript.Shell")

' Create an Excel instance
Dim myExcelWorker
Set myExcelWorker = CreateObject("Excel.Application")

' Disable as much UI as possible.
myExcelWorker.DisplayAlerts = False
myExcelWorker.AskToUpdateLinks = False
myExcelWorker.AlertBeforeOverwriting = False
myExcelWorker.FeatureInstall = 0   ' msoFeatureInstallNone (the constant is not defined in VBScript)

' Set this so that the Excel files can be found and saved
Dim strPath
strPath = WshShell.CurrentDirectory
myExcelWorker.DefaultFilePath = strPath

Dim objArgs
Set objArgs = WScript.Arguments

' Open the Workbook specified on the command-line
Dim WBook
Dim strWorkerWB
strWorkerWB = strPath & "\Worker.xls"
Set WBook = myExcelWorker.Workbooks.Open(strWorkerWB)

' This is my test code, I want to update a cell so that I know that I
' opened the spreadsheet and did something.
Dim TaskId
TaskId = objArgs(0)

WBook.Sheets("Automate").Range("O3") = TaskId

WBook.Save

' Clean up and shut down
WBook.Close
Set WBook = Nothing
Set objArgs = Nothing
myExcelWorker.Quit   ' quit the Excel instance so the process exits
Set myExcelWorker = Nothing
Set WshShell = Nothing



Notice in the code that I build a path to the Worker workbook. I need to do this because of how Excel works: if I don’t use a path, Excel looks in the wrong directory and doesn’t open my workbook.

It should also be noted that neither the Worker workbook nor the Launch script should require user interaction. Anything that runs on a compute node will not complete execution if user feedback is required or message boxes pop up. So make sure you have removed all UI elements.
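One extra precaution I find useful (my own habit, not part of the sample): have the Worker workbook suppress its own alerts as well, in case it is ever opened outside the launch script.

' In the Worker workbook’s ThisWorkbook code-behind
Private Sub Workbook_Open()
    Application.DisplayAlerts = False   ' no blocking dialogs on a compute node
End Sub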

Test it:

We’re now ready to test. As I stated earlier I run all initial testing on a closed system. This just makes it easier for me to access everything I need. Here are my testing steps:

1. I open the Master Excel spreadsheet and press ‘Start’. This action causes the files I need to be copied to my compute node. In a closed system this results in the files being placed in a specific subdirectory on my machine.
2. Once the Master returns I open Digipede Control and get my JobId for that test.
3. I then open a Command Prompt and go to the directory: “C:\Documents and Settings\All Users\Application Data\Digipede\Agent\Data”. The subdirectory I want from here is JobId plus “v1”. So in this case I will go to “15v1” because my JobId is 15.
4. I look in the subdirectory to see if the files I was expecting are there. Yep, looks good.

The code above is the working code, but it didn’t work in its first incarnation; debugging was required. To debug what was happening on the compute node, I ran RunWorker.vbs from the command line with the following call:

cscript /x runworker.vbs 1

I then modified the script until it worked. When I had a working script I replaced the original copy of RunWorker.vbs with the working copy.

I now know that the files are being moved and that my script is launching and initializing the Worker correctly. The next step will be to add an input data file and I will cover that in Part 2.

Essay Links:

Start
Automation
Part 1
Part 2
Part 3
Part 4
Part 5


2/16/2006

Anyone have any brain candy?

There are some blogs that I just have to read. Boing Boing is like candy for the brain. I need it and now I need the Dilbert Blog. The thrill I get from both of these blogs is the same thrill I get when I watch “Outer Limits”. I never know what might happen but I know it won’t be what I expect. Thanks.

Anyone know of any similar blogs?


Grid Computing: Digipede Update Feb 2006

Cool things are happening here at Digipede. Since the last time I did a Digipede update, here’s what’s new:

- The Digipede Network was named a finalist for a 2006 CODiE Award in the Distributed Computing Solution category. The awards will be announced on May 16th.


- We added another case study, this time for a company that was trying to scale their application using multithreading. They found that adding multithreading to their application was costly and time-consuming, so they turned to Digipede for a solution. You can read the case study here.

- We’ve been working with 4th Story to help them speed up some of their computations. Our relationship was announced yesterday. This is exciting because it is the first commercial product to support grid computing using the Digipede Network. You can read about it here. I liked this quote from 4th Story’s CEO Steve Smith:

"Our software runs on Windows, and we’ve adopted .NET as our development platform; Digipede’s .NET support is far superior to anything else we’ve seen and it has significantly reduced our development time. Without re-architecting our solution, we brought a grid-enabled product to market in a matter of weeks instead of months, allowing us to offer our customers the performance and scalability they need. As our mutual customers’ needs continue to evolve, we look forward to more collaboration with Digipede."
- We just partnered with HP and we are looking forward to more official announcements later this year.

- Digipede got a mention on Greg Nawrocki’s Grid Meter blog on InfoWorld; you can read that here.

- And if you missed the Software Development Magazine review, “That Parallel Beat”, you missed the fact that we got a four-star rating. Most of what kept us from a five-star rating was already in development. We are pretty excited about that.

It’s been a good start to the year. We just have to keep getting the message out that grid computing isn’t just for huge companies and that grid computing can be easy. With the Digipede Network everyone can get in the game.

2/14/2006

Software: Rearchitecture Sample and How to Tips

I discussed in my previous post the value of separating the UI from the business logic. In this post I’ll give you some suggestions on how to do that.

It’s always easier to get management to sign off on a large development project if you can demonstrate increased market opportunities, monetary savings, and improved revenue. But execution from engineering is critical. I’ve seen several rearchitecture projects go bad because the person assigned the job of analysis didn’t really understand how to take the application apart and put it back together again. So the companies wasted a lot of money on a project that brought them NO tangible returns.

I’ve been involved with several successful product rearchitectures, and I think the technique that works best is to assign a person to look seriously at the code and identify ALL the places that need to be changed. This can take months depending on the size of the application. I’ve never done this with an entire team doing the review, but that should work if well managed. Make sure that all the places that need to change are identified and that there is a solutions plan that the engineers agree on. Do not take any action until you understand the entire scope of the project. This is an engineering fix and shouldn’t break the application.

To rearchitect a product you have to understand what your objectives and limitations are. You have to understand both the technical and business aspects. These are your constraints, and they define what you have to work with. Once you’ve identified your box, you need to identify what the real problems are. And once you know all that, you have the knowledge you need to devise a rearchitecture plan. A good rearchitecture plan reuses as much code as possible. Don’t reinvent the wheel; use what you already have. By doing this you can finish the rearchitecture faster and with a higher level of confidence in the code base.

I think it is vital that an architect not only understand the technology; she must also understand the business. A few years ago I joined a project that was two years late with no foreseeable end in sight. This was at a gaming company that made casino games. So: embedded system, written in C, no nice debugger, EPROMs, and a home-grown OS. I was the only trained software engineer on the team; the rest of my colleagues were hardware guys. My boss basically said, “Get us out of this mess.” I spent the first 2 months fixing bugs and adding support for a UART and the configuration screen. This was time spent familiarizing myself with the development environment, the code base, and the overall product architecture. I then spent the next 2 months identifying where the architectural flaws were and devising a two-phase plan (we could not have any downtime because the product was on the market). I then briefed the rest of the team on how we were going to fix everything. Seven months after I started, the product was stable and had been approved by the Gaming Commission for release.

I’ve seen an industry trend towards hiring architects that have no practical coding experience and I think this is a huge mistake. I see architecture as the place where technology and business meet. An architect has to have the ability to weigh the technical benefits and downfalls vs. business benefits and downfalls. Without practical product development experience I don’t believe a person can do that.

If you are considering a rearchitecture/refactoring project then make sure you have the right person doing the analysis. Choosing the right person can make you a hero and rain good things down on your company. Choosing the wrong person can make you a schmuck without a job.

Software: Divorce your UI

Over the last few years I’ve seen a couple of instances where separating the UI from the business logic would have really given a company a competitive edge. Now that I’m looking into web services and seeing the mobile market take off, it really hits home how important this is.

Let’s start with the question, “What is a UI?”

A user interface (UI) is the part of a program that a person or machine interacts with.

This can be a command-line interface, a GUI, or a web service. I’m going to explore the benefits of the different UI options:

Command-line UI
Adding a command-line interface can:

• Provide your QA department a way to test basic program functionality in an automated fashion. Once an automated test environment is set up, it can run overnight unattended, regression-test existing code, and basically save a lot of time and money.
• Provide customers a way to run your application in a batch process which opens up the possibility of running your application on a grid.

Graphical User Interface
Here are some GUI thoughts:

• A Windows-based UI provides a way to graphically interact with the business logic through a client.
• A browser-based UI opens your product up to other operating systems and remote access.
• A mobile UI can make your product available in a whole new way.

Web Service
Adding a web service interface can:

• Provide the QA testing benefits of a command-line UI.
• Open your application up to other companies to build products that use your data and processes, perhaps creating partnering and/or service opportunities.

So if you managed to hang with me this far, I hope this got the wheels turning in your head. Ask yourself, “How separate is my business logic from my UI logic?” If the answer is “Very,” then perhaps you should consider adding some of the other UI options. If the answer is “Not very,” then perhaps you should consider starting a refactoring project. By creating separation you empower your company to respond to change more quickly. Faster time to market, increased productivity, and improved product quality can all be yours by separating the code.
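
To make the idea concrete, here is a minimal sketch of what that separation can look like. It’s in Python rather than the .NET languages I usually write about, and every name in it is invented for illustration: the point is simply that the business logic is a plain function with no UI dependencies, and the command-line UI is a thin wrapper around it.

```python
# Hypothetical example: business logic kept free of any UI dependencies.
# All names here are invented for illustration.

def calculate_discount(subtotal, is_member):
    """Pure business logic: no printing, no widgets, no HTTP."""
    rate = 0.10 if is_member else 0.0
    return round(subtotal * (1 - rate), 2)

def main(argv):
    """Thin command-line UI: parse arguments, call the logic, print."""
    subtotal = float(argv[1])
    is_member = len(argv) > 2 and argv[2] == "--member"
    print(calculate_discount(subtotal, is_member))
    return 0

if __name__ == "__main__":
    import sys
    raise SystemExit(main(sys.argv))
```

Because calculate_discount knows nothing about its caller, QA can drive it from automated tests, a batch process can run it on a grid node, and a GUI or web service can reuse it unchanged.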


brrreeeport! Baby it's cold outside!

Robert Scoble started a test of the blog search engines, and I like to help. So here is my "brrreeeport" text.


Software: Web Service for Beginners

I just finished Digipede-enabling a web service so that we would have a working sample of a web service use case. I have bigger plans for Digipede behind a web service, but I have to start somewhere, and the first place to start is with learning the technology. So I started with a simple Monte Carlo Pi web service. (I should probably have started with Hello World, but I’m a little overconfident.)

Creating a web service and testing it was extremely straightforward. This was quite a surprise. I was able to get everything up Monday morning between 3am and 4am. (Sometimes I can’t sleep...what can I say?)

I used a MonteCarloPiLibrary DLL (which contains the Monte Carlo Pi algorithm) that I have hanging around from other projects and wrapped a web service around that. I created a web service function double GetPi(int numTasks, int numDraws).
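
My actual MonteCarloPiLibrary is a .NET DLL, so I won’t reproduce it here, but for anyone unfamiliar with the algorithm, here is a rough sketch of the standard Monte Carlo estimate of Pi (written in Python for brevity; the function and parameter names are mine, not the library’s):

```python
import random

def estimate_pi(num_draws, seed=None):
    """Estimate pi by drawing random points in the unit square and
    counting how many land inside the quarter circle of radius 1."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(num_draws):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            hits += 1
    # The quarter circle covers pi/4 of the unit square.
    return 4.0 * hits / num_draws
```

Each draw is independent of all the others, which is exactly why this kind of computation splits so naturally into Tasks on a grid.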

The web service acts as a Master application, so I wrote the 20 lines of code needed to create the Tasks and distribute the work. But I was having problems: I would send the work out to the Digipede Server but my call to WaitForJob() was not returning. I finally figured out today that I didn’t need RaiseEventsOnCallingThread set to true; removing that property setting fixed my problem.

Here are a few things I learned:

- I created a console application (TestWS) that made a call to the web service. It took me a little while to figure out that to access the web service class I had to do Add Web Reference and add the web service to the TestWS project. It’s strange to access a type that isn’t clearly defined in the code. Here is the code for TestWS:

static void Main(string[] args) {
    // Get Pi from the web service
    localhost.MonteCarloPiService myService = new localhost.MonteCarloPiService();
    double piRet = myService.GetPi(1, 1000);
    Console.WriteLine("piRet = {0}", piRet);
    Console.ReadKey();
}


I don’t have a clear understanding of localhost yet (as defined in the code above); it looks like the text can be whatever you want. This clearly becomes one of those things you want to have a good naming convention for.

- When debugging TestWS I had to step into the web service. When I set a breakpoint in the web service code, the debugger didn’t stop. So set your breakpoint on the web service call and then step in.

- If you get the VS2005 error message, “Unable to automatically step into the server. The remote procedure could not be debugged...”, you need to change the web.config file for the web service so that the compilation tag’s debug attribute is set to true: <compilation debug="true">. Thanks Harish.

- I didn’t need to create a test console application, because running the web service with the VS2005-provided start page would have stopped program execution at a breakpoint in the web service. But when I first tried that I didn’t have the compilation property in web.config set correctly. This was not clear when I started, so I did what everyone does and fell back to the stuff I know.
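
For reference, the fragment of web.config that matters for debugging looks roughly like this (abbreviated; a real ASP.NET 2.0 web.config contains many more elements than shown here):

```xml
<!-- web.config (abbreviated): debug="true" lets the debugger
     step from the client into the web service code -->
<configuration>
  <system.web>
    <compilation debug="true" />
  </system.web>
</configuration>
```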


I did this initial development with a Digipede Server and Agent all installed on my laptop, basically a closed system. Once I got it working I changed the login information and told the application to run against Digipede’s main testing Server. The only code I changed was the login information, and my test application worked the first time without any other changes. That was cool.

I’ve still got some work to do to make this sample look nice but creating my first web service and Digipede-enabling it was a much shorter project than I expected. I should also add that it’s been years since I did any ASP work and I’ve never done any ASP.NET work. Microsoft has done a great job with the new VS2005.

2/13/2006

Business: How to Get Good Service at a Restaurant

I was reading a restaurant post and people were talking about how hard it is to get a server’s attention. I posted a comment on that site, but I think the idea warrants further commentary. There was a lot of discussion about restaurants which have some signaling device on the table, like a light or a sign. I think those are interesting devices, but I can’t help but think...WHY? You see, I believe that when you walk into a restaurant you begin a relationship with the staff. Relationships are about give-and-take and communication. It’s important to understand what both parties want and need.

What does the server want?
Servers usually make minimum wage or less, which is allowed because they get tips. It’s in the best interest of your server to make your dining experience pleasurable while getting you out the door quickly. They want to get a good tip and get the table flipped. The more tables they flip, in theory, the more money they make. That is what the wait staff wants. That is what the bar staff wants because they get tipped out by the wait staff. In some restaurants the cooks may even be tipped out.

What does the customer want?
What you want is a stress free dining experience. You want to order when you’re ready, get your food quickly, and have it prepared the way you like it. You want your drink refills promptly and dirty dishes cleared.

The average server wants to help the customer but she may lack the ability to read your mind. Thus management can create a light to summon the server, which is demeaning. Or you, the customer, can try to communicate with the server in a manner that lets the server know what you need without requiring you to wave your arms, yell, or whistle. Remember the server is trying to make you happy without being intrusive. When I lived in Atlanta, GA my roommate of five years was a career restaurant manager. I spent a lot of time with people in the restaurant business and it is from them that I learned these tricks:

- When you’re ready to order, lay your menu on the table in front of you.
- When you need a drink refill, put your glass on the edge of the table on the aisle.
- When you have a dish you want cleared put that on the edge of the table on the aisle. If the plate was from the appetizer this can also act as a signal that you are ready for the next course.
- When you are finished push your plate away from you or place it on the edge of the table. Put your napkin on the table.
- Always be pleasant and say “Thank you” each time the server responds to your cues. Remember, smiles are contagious; if you start the relationship with your server with a smile, it is likely to continue that way. It takes very little effort to be polite and courteous, and that can make the difference between a so-so dinner and an outstanding dinner.

Sometimes I do all my communication tricks and the server is just really bad. She may not be well trained; she may be having an exceptionally bad day; whatever. That is not my problem. Each time I sit down at a restaurant I start a tip meter in my head at 20%. Each action by the server results in an adjustment, up or down. I have tipped 50% on wonderful service and I have left nothing on really poor service. I never use the quality of the food as a criterion; that is something to take up with the manager.

If you haven’t ever used these tricks, try them out, and let me know if they work for you. I really think that by approaching the restaurant experience as a relationship you will have a more pleasant dining experience.


2/11/2006

Links: February 10, 2006

As gross as this is...I just bought one: Tick Twister

Awesome link on leadership

Business: Lead through Volunteering....

NCWHL will be looking for coaches for the Red division for the summer season. If you live in the San Francisco Bay area, know hockey, and are interested in coaching you can contact the league coaching coordinator. A few years ago I wrote an email to encourage people to volunteer and become a coach. Yoshii asked to use the email this year since she is losing all three of her Red coaches. I gave her the go ahead and I’ve included it below.

"I would like to add to Terry’s thoughts about coaching because coaching isn’t just what you can give to others; you get a great deal back. I started coaching in 1986 and I have learned that it is truly something that I love.

Here are some of my reasons why:

1. I get the chance to share something I love with people who want to learn about it. Not only is the audience eager to learn, they actually appreciate the time and effort I put in to help them! Talk about a pat on the back!

2. I get to study people; all different kinds of people, each one approaching the world a little differently. What this helps me do is improve my communication skills. I’m much better today at reading people than I was in 1986! (And it’s not just because I’m 18 years older :-) And I’m also much better at changing the words I use to reach different types of people.

3. I get to practice my strategic thinking and planning skills as well as my analysis skills on real people in real situations. Yes, I get a whole group of people who let me practice those skills on them! People who want me to practice those skills because they want me to help them win. At the beginning of the season I figure out what core skills my team needs to work on to take them, as a team, to the next level. Each game I watch each player to learn what she needs to work on to take her individual game to the next level. There are always those two major dynamics at work: the team level and the individual level. And it is my responsibility as the coach to balance them so that people learn and have fun. I use the information I gathered to decide what my game strategy will be and what lines I should use to achieve that strategy. And you know there are times in a hockey game when quick, small tweaks can make the difference between a win and a loss. It’s a mental challenge that I find much like chess and I personally enjoy it.

4. Because of the time I have spent coaching I am not afraid to talk to strangers or to groups of people. I’ve found a peace within those public spaces mainly because I’ve put myself out enough to learn that I’m not really in danger or threatened. My heart no longer races and my voice doesn’t crack. I’ve gained a confidence in myself that I find very valuable in my everyday life.

5. I find an incredible amount of joy in seeing a player do something that she didn’t think she could. I personally believe the greatest gift you can give to someone is to help her reach her full potential.

6. Coaching helps me understand the game better because I have to actually look at the game and figure it out so that I can teach and lead others.

7. I have learned to lead and to motivate.

8. This season I was skating in Maroon and coaching in Red, and I feel that player-coaches can be a bridge between divisions. Walls have been going up between levels, and this season I was able to punch a few holes in them. The NCWHL plans for the summer will hopefully do far more to tear down the walls than what I did, but I personally believe that coaches can act as bridges between divisions so that people will be happy about moving up and not dreading it.

In conclusion, I am a better person because I am a coach. I strongly encourage everyone to think about what I’ve written, write me back if you want to talk about it, talk to your family and friends but think about it. NCWHL is giving you an incredible opportunity to grow as a person and to share your passion for hockey. It’s a safe environment to learn in and there are plenty of people that are willing to help you.

Kim Greenlee
Red Coach for the Mole-Whackers
March 2, 2004"


2/06/2006

Software: Excel with a State Machine and My AutomationSecurity Savior

If you’ve been reading my blog you know that I work for Digipede Technologies which is the only .NET based commercial grid computing solution currently available on the market. Grid computing can speed up and scale out computationally intensive and/or parallelizable algorithms. One big area that needs the power of grid computing is Excel spreadsheets. People build simulations in Excel and those simulations can take a long time to run. So I’m trying to figure out how to solve the various Excel problems that can come up. I’ve identified one pattern which is a Master application calling a Worker workbook. The sub pattern I’m currently working on uses a static Worker workbook.

One of the fun things about working on the leading (bleeding) edge of technology is the option of making up your own words. So what is a static Worker workbook? First I’ll define a Worker workbook: a Worker workbook is a workbook that is distributed around the grid and executed on the compute nodes. A static Worker workbook is a Worker workbook that does not require any task-specific input parameters.

Well, playing around with my static Worker workbook I got myself into a pickle. Automating a static Worker workbook requires two things: 1) the workbook’s computation, open, and close must be controlled from the Workbook_Open() routine, and 2) a state machine is needed to control the execution of the computation and initiate automatic shutdown. If you don’t have a state machine, then the computation and shutdown will happen every time the workbook is opened, meaning you can’t edit it or extract the results.

My Worker workbook has three states:

1 = Editable; stay open and don’t run the computation
2 = Run; run the computation then close the workbook
3 = Extract; stay open and don't run the computation, data updated
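
The dispatch logic that Workbook_Open() has to implement for those three states is tiny. Here is a sketch of it in Python rather than VBA, just to show the shape; the state numbers match the list above, and the names are mine:

```python
# Worker workbook states, matching the list above
EDITABLE, RUN, EXTRACT = 1, 2, 3

def dispatch(state):
    """Return (run_computation, close_workbook) for a state cell value.

    EDITABLE (1): stay open, don't compute, so the workbook can be edited.
    RUN      (2): compute, then close; the mode used on the compute nodes.
    EXTRACT  (3): stay open, don't compute, so results can be read out.
    """
    if state == RUN:
        return (True, True)
    if state in (EDITABLE, EXTRACT):
        return (False, False)
    raise ValueError("unknown worker state: %r" % state)
```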

While I was developing the state machine I made a mistake and set my state to 2 when it should have been 3, which meant that I was unable to get the workbook open to fix the state. To get around this problem I thought I’d be smart and write a VBScript to load the workbook and change my state cell. But Workbook_Open() kept being executed, which caused the workbook to shut down. Then I discovered the Application.AutomationSecurity property, which saved me. The AutomationSecurity property allows you to programmatically control whether your macros are executed or not. By default, macros in an automated process are trusted, so they are executed. I did not want the macros to run because I needed the workbook to stay open (remember, my macro was automatically shutting down the workbook). By setting the AutomationSecurity property to msoAutomationSecurityForceDisable (enumeration value 3) I was able to disable Workbook_Open() and change my state cell from the VBScript. Here is the script:

' Create an Excel instance
Dim objExcel
Set objExcel = CreateObject("Excel.Application")

objExcel.DisplayAlerts = False
objExcel.ScreenUpdating = False

Dim objFileSystem
Set objFileSystem = CreateObject("Scripting.FileSystemObject")

Dim strWorkbook
strWorkbook = objFileSystem.GetAbsolutePathName(".") + "\StaticWorker.xls"

' Save the current security setting
Dim objSecuritySave
objSecuritySave = objExcel.AutomationSecurity

' Disable macros for this automation session
objExcel.AutomationSecurity = 3 ' msoAutomationSecurityForceDisable

' Open the Workbook
Dim objWorkBook
Set objWorkBook = objExcel.Workbooks.Open(strWorkbook)

' Reset the State
objWorkBook.Worksheets("WorkerSheet").Range("F1") = 1
objWorkBook.Save

' Restore the security setting
objExcel.AutomationSecurity = objSecuritySave

' Clean up and shut down
objWorkBook.Close

Set objWorkBook = Nothing

objExcel.ScreenUpdating = True
objExcel.DisplayAlerts = True
Set objExcel = Nothing


(Just for the record this is not a completed example of launching an automated Excel object. There are other recommended calls to disable interface components that I need to add but this did let me get back to work and I thought it might be of interest.)

2/04/2006

Business: Do You Know What You’re Really Buying?

On Friday I stopped by Safeway to pick up a salad for lunch. This happens every week, so no excitement there. While I was standing in the checkout line I noticed that the woman in front of me was buying pastries. She was buying a boatload, and they were all from a local bakery that I had once toured.

During the exciting times after the Internet implosion and the double-dip recession that followed, I was unable to find steady work as a developer or consultant, so I decided to expand my sales experience. I took a job selling chemicals. It was at this job that I had the opportunity to tour several food processing plants, one of which was the bakery that made the woman’s pastries. What an incredibly disgusting and filthy place that was. The bakery was having trouble with icing building up in drains and pipes, so I took a tour through the facility to see if I had any products that would help them out. The facility was poorly lit, had standing water on the rough concrete floor, open drains, discarded products lying around, rusty equipment, exposed rafters...as I said, it was just disgusting. I haven’t bought that company’s products since I saw the plant. As I stood watching the woman buy those pastries, I couldn’t help but wonder how much of what we eat is made in similar environments.

I’ve toured an Otis Spunkmeyer plant that was so clean and well maintained that I could have eaten off the floor, and a Safeway milk plant that was the same AND behind a badged security door. After seeing those plants I feel confident in the quality of their products. With all the care both companies took with cleanliness, I doubt they will ever make anyone sick. And because I’ve seen the other bakery, I am willing to pay a little more for the sense of security I get from Safeway and Otis.

What a difference it would make in my purchasing decisions if I could see the manufacturing conditions. I’m certainly willing to pay a little extra for peace of mind.

What about you?

2/02/2006

Software: Remnant .NET 1.1 tag in converted .NET 2.0 csproj file

I converted several of my .NET 1.1, VS2003 sample projects to .NET 2.0, VS2005 using the conversion wizard that runs when VS2005 opens a VS2003 project. Each sample converted successfully, but my assembly references were wrong. Because the Digipede Network now supports .NET 2.0, we have two versions of the Digipede Framework, one for .NET 1.1 and one for .NET 2.0. This requires two sets of sample project files, one for each version. Resetting the references allowed all but one of my samples to run fine.

The MasterCommandLine project was unable to find the correct Digipede.Framework.dll assembly. I looked at the project through the VS2005 interface but didn’t see anything out of the ordinary. So, even though Microsoft recommends that we not edit .sln or .csproj files ourselves, I cracked open the .csproj with Notepad and looked for any fully qualified paths. What I found was this:

<AssemblyFolderKey>hklm\dn\digipede.framework.1.0.560.0</AssemblyFolderKey>

I wasn’t able to find any Microsoft documentation explaining what AssemblyFolderKey is for, but I did observe two things:

- AssemblyFolderKey was not in my other projects
- AssemblyFolderKey contained an invalid assembly name (1.0.560.0 was the first release)

So I deleted the tag and my project found the correct assembly.
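
For anyone who hasn’t poked around inside a VS2005 .csproj, a healthy assembly reference resolves through the Reference item itself. Something like the following is typical (a sketch only; the version number and hint path here are placeholders, not the actual values from my project):

```xml
<!-- Hypothetical .csproj fragment; version and path are placeholders -->
<Reference Include="Digipede.Framework, Version=2.0.0.0, Culture=neutral">
  <SpecificVersion>False</SpecificVersion>
  <HintPath>..\lib\Digipede.Framework.dll</HintPath>
</Reference>
```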
