Tuesday, December 21, 2010

Eclipse to Visual Studio 2010: What I'm Missing

I've recently been spending a lot of my time in Visual Studio 2010. Having spent many years in Eclipse, I've found the transition somewhat painful. In no particular order, here are the features that I'm still trying to track down. VS experts, help me out!

  1. Quick Fix - Assign to local variable.  Write the RHS of an assignment statement, then Ctrl+1 to have the IDE suggest/insert a LHS.
  2. Curly brace auto-completion.  When I type the opening brace and hit enter, automatically indent one tab and add a close brace on the following line.
  3. Ctrl-Click for type definition hyperlinking.  Basically, a mouse gesture for Visual Studio's Go To Definition (F12). I've been told there is an extension for this??
  4. Ctrl-Click for externalized strings (resources in .NET?).  In Eclipse, when I Ctrl-Click on an externalized message referenced as a static String, I can jump to the properties file directly.
  5. "Flexible" Go To Definition. In Eclipse, if I write some code that calls a method, and I don't have the particular overload parameters sequenced correctly, I can still (typically) jump to the definition of one of the overloads. Since they're typically clustered together in the target type, this is a poor man's way of getting around reading the Javadoc.
  6. (Speaking of Javadoc) Hover over a method to see its NDoc documentation. Reading the commented XML syntax on a method doesn't do it for me.
  7. Link With Editor / Show In.  When I'm jumping around using Go To Definition, I quickly lose track of where (in the resource tree) a particular class is.  In Eclipse, I can toggle and untoggle Link With Editor to highlight the resource in Package Explorer.  How do I select the current editor's file in Solution Explorer?
  8. Synchronize View. This one is a HUGE hole. How the hell do you .NET people sync your code in a sane way??
  9. Multi-Page Editors. I'll qualify this one a bit by saying that I've seen the implementation details of multi-page editors in Eclipse, and there are some definite issues. But the functionality is critical when you have an XML document which also has a graphical editor. (Entity Framework conceptual models, I'm lookin' at you.) Gimme a synchronized two-tab editor so I don't have to keep doing Open With, closing the other editor (can't have both open at the same time, another FAIL), etc.
  10. Outline View in the XML editor.  'Nuff said.
  11. Collapse All / Expand All.  Solution Explorer needs these badly.
  12. File -> Restart. After I install a Visual Studio Extension, I need to restart. Please make this one click, thanks. 
  13. Workspaces.  I really don't know how to explain this one. You can see one of my complaints (relating to Team Explorer) in this post. Another example is that of "recent solutions". Please just keep my last solution open?  What am I missing here?  I need to have separate "threads" of work that are preserved somehow, so I can easily switch between conceptual projects in my daily work.

Please post back in the comments with answers to all of these so that I don't resort to editing C# code in Eclipse. Thanks.

Thursday, November 11, 2010

Testing declarative Eclipse expressions

Our Eclipse-based product plugs into the platform debug support, and we have a large number of launch shortcuts whose enablement expressions are specified in plugin.xml. They are a huge pain in the ass to test manually. If a particular shortcut is disabled when it should be enabled, there isn't a good way (that I've seen) to trace the evaluation of the enablement expression at runtime. So it comes down to stepping through the deeply nested Expression.evaluate(..) calls, where you can easily get lost.
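For concreteness, one of those shortcut contributions looks roughly like this in plugin.xml (a sketch: the id matches the one used in the test code below, and the expression body is illustrative):

<extension point="org.eclipse.debug.ui.launchShortcuts">
  <shortcut id="myShortcutId" label="My Shortcut" modes="run, debug"
      class="com.example.MyLaunchShortcut">
    <contextualLaunch>
      <enablement>
        <with variable="selection">
          <count value="1"/>
          <iterate>
            <adapt type="org.eclipse.core.resources.IResource"/>
          </iterate>
        </with>
      </enablement>
    </contextualLaunch>
  </shortcut>
</extension>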

This is one of those times where I feel like the light bulb should have gone on for me years ago. It turns out there is a much easier way to test these things: by writing JUnit Plug-in Tests for them.

What follows is some sample code to do this. Since I'm testing the enablement of launch shortcuts, the configuration elements are specific to that scenario, but you should be able to do this for any expression that appears in a bundle's plugin.xml.

In a @Before method, I read the expression from the plugin.xml and convert it into an Expression object.

// expression is a field on the test class, shared with the @Test methods
IExtensionPoint extensionPoint = Platform.getExtensionRegistry().getExtensionPoint(
    IDebugUIConstants.PLUGIN_ID, IDebugUIConstants.EXTENSION_POINT_LAUNCH_SHORTCUTS);
IConfigurationElement[] infos = extensionPoint.getConfigurationElements();
for (IConfigurationElement elem : infos) {
  String id = elem.getAttribute("id");
  if ("myShortcutId".equals(id)) {
    IConfigurationElement contextLaunch = elem.getChildren("contextualLaunch")[0];
    IConfigurationElement enablement = contextLaunch.getChildren("enablement")[0];
    expression = ExpressionConverter.getDefault().perform(enablement);
    break; // found the shortcut we care about
  }
}


And in a @Test method, I set up the context I'm trying to mimic and evaluate the expression.

@Test
public void ui() throws Exception {
  List<Object> ctxt = new ArrayList<Object>();
  ctxt.add(selectedObject); // some IResource, perhaps
  IEvaluationContext context = new EvaluationContext(null, ctxt);
  context.addVariable("selection", ctxt); //$NON-NLS-1$
  // m_uiEnabled is a test fixture field holding the expected result
  assertEquals(EvaluationResult.valueOf(m_uiEnabled), expression.evaluate(context));
}

Now I can muck around in those delicately constructed XML expressions and rest assured that I have a test suite watching my back.

Monday, February 1, 2010

JUnit4 and the Eclipse Test Framework: Success!

(If you're eager, just jump down to the bolded part about victory.)

First, some background. I think the Eclipse Test Framework is one of the gems that is part of Eclipse. Effectively it's a way to run JUnit tests against your bundles while Equinox OSGi is running. You can also have the whole workbench UI running if you like, which I think is a common case. In fact, for quite some time, that was the *only* case that I considered it for. If I had a JUnit test that didn't need the workbench, I could "Run As -> JUnit Test". If I had a JUnit test that needed the workbench, I could "Run As -> JUnit Plug-in Test". But I never really mentally equated the "workbench" with "OSGi framework."

Then I went to write a test for one of my bundles that has a dependency on Jetty. And it's a version of Jetty that is different from the one that ships with Eclipse. When I ran it as a vanilla "JUnit Test" from within Eclipse, everything worked fine. I think I just "got lucky" and it picked up the right version of Jetty when I launched the test. But when I ran my test via the Ant "junit" task (no Eclipse Test Framework), it failed because the wrong version of Jetty got loaded. All of a sudden I realized that without my delicately constructed dependencies being managed by OSGi, I was lost back in the world of "which JAR file is getting used?"
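By "delicately constructed dependencies" I mean version ranges in the bundle manifest, something like this (the package and range here are illustrative):

Import-Package: org.mortbay.jetty;version="[6.1.0,7.0.0)"

Without OSGi enforcing that range at runtime, plain classpath ordering decides which Jetty wins.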

So, this brings me to the real guts of this blog post. After discovering I needed OSGi during test execution, I took a second look at the Eclipse Test Framework. Wow, this looks promising! Headless OSGi-based bundle testing! Just what I need. So I converted my build script to go that route, only to run smack dab into bug 153429, which captures that the Eclipse Test Framework only supports JUnit3.

That was at least a year ago, maybe two. I commented out my Jetty test and continued running my other JUnit4 tests using Ant. I even wrote some JUnit3 tests that I could run against the Eclipse Test Framework. Meanwhile, the CC list on bug 153429 continued to build and milestones kept passing without any progress. I stubbornly left my Jetty test alone, refusing to rewrite it for JUnit3 since I figured JUnit4 on ETF was right around the corner.

Finally, with Eclipse 3.6M5, we have complete and utter victory. I have successfully executed my JUnit4 tests with the Eclipse Test Framework. I'm not building my product against 3.6 yet (still 3.5.0 actually), but I was still able to grab the zip for ETF and make it part of my base Eclipse for PDE build to consume.

Now that I have my tests working, I do have some observations to share.

Rather than launching the SDK to run my tests, I'm launching my own product, spit out by PDE build. Part of the reason is that I have some additional plug-in tests written using WindowTester, and they care that it's my product which is launched, not the SDK.

There was a gotcha with this. ETF was created before p2. So you just dropped your test bundles into the "plugins" folder of the SDK, launched the SDK specifying the ETF application and test bundle / test class, and off you go. Today, if you do the same, and you're launching the SDK to run your tests, it will still work because the SDK ships with the dropins reconciler enabled. My product does not (which is the default, I believe). So I couldn't simply drop in my test bundles and make them available for ETF to find.

The solution to this is actually quite clean and works nicely.

  1. Enclose the test bundles in a test feature, expressing the appropriate dependency on ETF. (Hint: I had to edit the feature.xml manually to include it, since I didn't make ETF part of my target, so I couldn't pick it from the PDE editors.)
  2. Run PDE build to build the test feature after building the product.
  3. Save off a copy of the product before I test with it.
  4. Formally install the test feature into the product using the p2 director (see the sketch after this list).
  5. Launch the product, specifying the ETF application and a test bundle / test class that has been installed into the product.
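For steps 4 and 5, the invocations look roughly like this. It's only a sketch: the repository path, profile name, feature and bundle names are all placeholders, and the launcher incantation varies by platform (in 3.5 the director application id is the longer org.eclipse.equinox.p2.director.app.application).

# Step 4: install the test feature into the saved-off product copy
java -jar plugins/org.eclipse.equinox.launcher_*.jar \
  -application org.eclipse.equinox.p2.director \
  -repository file:/path/to/test-feature-repo \
  -installIU com.example.tests.feature.feature.group \
  -destination /path/to/product-copy -profile MyProductProfile

# Step 5: launch the product, pointing ETF at an installed test bundle/class
java -jar plugins/org.eclipse.equinox.launcher_*.jar \
  -application org.eclipse.test.uitestapplication \
  -testPluginName com.example.tests \
  -className com.example.tests.SomeJUnit4Test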

I've also included in my test feature the EMMA OSGi bundle for measuring the coverage provided by my bundle tests.

Thanks to everyone who voted, provided patches, and general community support to get the ETF supporting JUnit4. It really completes the testing package that is available to developers working with Eclipse and more generally OSGi.

Tuesday, June 30, 2009

p2 UI policy and Declarative Services

This is another post in what is becoming a short series (only two posts so far) about moving a product from 3.4 to 3.5.

After I got my build working, the next step was making sure that I could update from one product version to the next. I was especially excited about the resolution of https://bugs.eclipse.org/bugs/show_bug.cgi?id=246060 which allows for .qualifier to be replaced in a product version. No longer would I have to manually increment the product version number for purposes of updating to and testing a nightly build.
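With that bug fixed, the .product file can carry the qualifier directly, roughly like this (a sketch with made-up names):

<product name="My Product" uid="com.example.myproduct"
         version="1.0.0.qualifier" useFeatures="true">
  <!-- PDE build substitutes the qualifier, yielding 1.0.0.abc, 1.0.0.def, ... -->
</product>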

So my plan was:

  1. Run PDE product build to generate version 1.0.0.abc
  2. Unzip 1.0.0.abc to some location.
  3. Run PDE product build again to generate version 1.0.0.def
  4. Launch 1.0.0.abc, point it at the repository for 1.0.0.def, and update.
  5. ...
  6. Profit.

Unfortunately, when I launched 1.0.0.abc, the Install New Software dialog didn't have a way for me to add a new repository. Ditto for the preference page.

Turns out there is a more robust set of p2 UI building blocks in 3.5, which is handy for RCP developers. That is described in great detail here: http://wiki.eclipse.org/Equinox/p2/Adding_Self-Update_to_an_RCP_Application

I should mention that the RCP-p2 example in 3.5 is leaps and bounds ahead of the one from 3.4 (there wasn't one) - so props to the p2 UI team on that.

At any rate, the wiki page tipped me off that there is a UI policy which controls which components are shown and enabled. This policy is implemented as an OSGi declarative service. What really threw me for a loop is that I wasn't trying to do anything special with this policy; I just wanted the stock SDK one, since our product is based on the SDK.

Debugging the Policy Behavior

I stepped through the preference page code and discovered that the SDKPolicy wasn't getting discovered as a service (the preference page just got an empty Policy every time). So this sent me down the route of launching with -console to get an OSGi console [2] and look for the policy service. After fighting with the filter syntax of the services <filter> console command, I googled a bit more and found the runtime options [1] for spitting out verbose DS logging information. I turned those on, but I didn't get anything logged. I was pretty stumped at this point.

Then a light bulb came on: maybe declarative services wasn't running at all? A quick ss ds at the console showed that it was RESOLVED but not active! I did a start to spin it up and all of a sudden a deluge of DS logging information printed out. And then SDKPolicy started working, and voila my p2 UI was working.
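The console exchange went roughly like this (the bundle id and version here are illustrative):

osgi> ss ds
id    State       Bundle
50    RESOLVED    org.eclipse.equinox.ds_1.1.0.v20090601
osgi> start 50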

It turns out the root cause is that we had a custom config.ini in 3.4 to specify a custom osgi.instance.area location. This was screwing up the start level for the ds bundle. I switched the product to generate a config.ini for me, did a new build, and everything worked. I plan to migrate the osgi.instance.area configuration step to a p2.inf file, which is what the platform releng guys do.
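My understanding is that the p2.inf route uses the setProgramProperty touchpoint action to write the property into config.ini at install time, roughly like this (a sketch; the property value is just illustrative):

instructions.configure = org.eclipse.equinox.p2.touchpoint.eclipse.setProgramProperty(propName:osgi.instance.area,propValue:@user.home/myproduct/workspace);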

Useful Links

[1] Equinox Runtime Options
[2] Explore Eclipse's OSGi Console
[3] Around the world in Java: Getting Started with OSGi Declarative Services
[4] p2 UI policy bug #1
[5] p2 UI policy bug #2

Monday, June 29, 2009

Debugging PDE Build and the publisher

I posted a problem to the PDE newsgroup last week about unexpected requirements in my product feature. This was in the context of moving a 3.4-based product to 3.5.

The general issue was that the director wouldn't install my product because of an unsatisfied requirement. It wasn't clear to me where this requirement was even coming from. Somewhere, there was some metadata in my plugins/features that expressed a dependency that had worked fine in 3.4 but failed in 3.5. My theory was that if I could capture when the publisher was generating the requirement, I'd be able to see the source of that requirement and squash it.

Tracing

First attempt was to turn on tracing for the p2 components. I managed to find the org.eclipse.equinox.internal.p2.core.helpers.Tracing class which listed out the different options. I stuffed those into a .options file:

org.eclipse.equinox.p2.core/debug=true
#org.eclipse.equinox.p2.core/generator/parsing=true
#org.eclipse.equinox.p2.core/engine/installregistry=true
#org.eclipse.equinox.p2.core/metadata/parsing=true
#org.eclipse.equinox.p2.core/artifacts/mirrors=true
#org.eclipse.equinox.p2.core/core/parseproblems=true
#org.eclipse.equinox.p2.core/planner/operands=true
#org.eclipse.equinox.p2.core/planner/projector=true
#org.eclipse.equinox.p2.core/engine/profilepreferences=true
org.eclipse.equinox.p2.core/publisher=true
#org.eclipse.equinox.p2.core/reconciler=true
#org.eclipse.equinox.p2.core/core/removeRepo=true
#org.eclipse.equinox.p2.core/updatechecker=true

Then the trick was to pass those options along to the AntRunner app which drives PDE build. I added -debug path/to/.options to my arguments to AntRunner (the full invocation is sketched below). Running the build again, I got two things, neither of which was helpful:

  1. Passing -debug to the Platform also passes -debug on to Ant, thanks to AntRunner. So my Ant ran in debug mode, which really clouded the issue with about 8 MB of debug output.
  2. The publisher only outputs two trace statements: start and finish. Nothing about what it is publishing. This may be a candidate for enhancement.
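For reference, the AntRunner invocation with the options file looked roughly like this (paths are placeholders; the real build file was the PDE productBuild script):

java -jar plugins/org.eclipse.equinox.launcher_*.jar \
  -application org.eclipse.ant.core.antRunner \
  -buildfile /path/to/build.xml \
  -debug /path/to/.options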

Based on these results, I reasoned that nobody else must be using this technique to solve their p2 problems. Moving on.

Stepping through the publisher

Next up: run AntRunner with Java debugging enabled so that I could connect remotely and set breakpoints in the publisher actions. I added the appropriate JVM args to enable the Java Debug Wire Protocol (JDWP). I started the build again, connected up, and started setting breakpoints in various publisher actions.
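Those JVM arguments are the standard JDWP ones, passed to the java process running AntRunner (the port is arbitrary; suspend=y makes the build wait for the debugger to attach before the publisher runs):

-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=8000

On the Eclipse side, a "Remote Java Application" debug configuration pointed at that port does the attaching.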

Since the rogue requirement was getting added to my product feature IU, I added a conditional breakpoint in FeaturesAction to look for that feature being processed.

Then, since the problematic requirement was org.eclipse.core.resources [3.4.0,3.5.0) I added another conditional breakpoint in getVersionRange to watch for incoming feature entries with 3.4.0 as their minimum version.

I did finally discover the problem: I had a bunch of old, outdated entries in my product feature's feature.xml, which included references to several different versions of o.e.core.resources. After I ripped those out, I had a successful build and director install.

Conclusions

  • Do not pass the debug flag to AntRunner for purposes of debugging platform code unless you are prepared to wade through volumes of output. (I guess this is a feature of AntRunner - https://bugs.eclipse.org/bugs/show_bug.cgi?id=5672)
  • It was not at all obvious to me that the way to debug p2 publisher actions is to set up a "remote" debug session with PDE build running inside of AntRunner. But it sure as heck was helpful once I figured it out.
  • I am actually glad that I ran across this problem, and that p2 is enforcing these types of constraints, because it helped me clean up outdated dependencies in my feature.

How are you debugging your p2 builds??

Sunday, May 3, 2009

Cloning a profile using p2

I heard at the p2 BOF at EclipseCon that you can use your profile as a p2 repository. This sounded like a cool way to take an Eclipse setup that you've customized and replicate it into a new install. Granted, there may be better ways to do this with shared bundle pools and what have you, but let's set that aside for a moment.

Since 3.5M7 is out, I decided to take a stab at cloning my 3.5M6 setup into 3.5M7. I had installed SVN and DTP along the way, so I was hoping this would take the pain out of having to go out and re-provision those items from the web. (Of course I should probably upgrade DTP to the M7 version, but again, let's set that aside.)

Steps:

1. Unpack 3.5M7 into a new directory
2. Launch it
3. Help -> Install new software...
4. Click the Add button to add a site
5. Click Local to browse
6. Select the .profile directory from previous install. In my case, file:/C:/eclipse3.5M6/p2/org.eclipse.equinox.p2.engine/profileRegistry/SDKProfile.profile/
7. Give it a name, hit OK
8. You should see all your plugins from that profile. You might need to uncheck "Show only the latest versions" and "Group items by category"
9. Check all the plug-ins you want (sadly, I couldn't find an action to mark multiple at a time)
10. Finish the wizard and restart the platform

And you're done! Awesome. Thanks to Simon Kaegi for implementing this.

Monday, April 20, 2009

Favoring immutability

A few years ago, when I read Effective Java for the first time, Item 13 really stuck in my mind: make your objects immutable wherever possible. You can share them freely. You don't have to worry about checking to see if that Person object you created still has a name. You made sure of that when you created it. And while I don't do a lot of concurrency programming, I've seen time and time again how much you get "for free" when you use immutable objects in multi-threaded environments. So that's another benefit.

OK, so I read the book, and moved on. Then I start working on a new project. We happened to pick GWT as part of the tool stack. During the early stages, a light bulb goes off in my head: hey, I'm writing a bunch of new beans in Java - I should make them immutable! Silver bullet, right? All my problems will go away. So I write a bunch of immutable beans.

Then I tried to serialize my immutable objects, and BAM, I hit this RFE: Serialize final fields. Turns out, it's actually kind of hard to take an immutable object, vaporize it into a bunch of bits, and then reassemble those bits back into an immutable object on the other side. So I ended up with a bunch of objects that really, really wanted to be immutable but couldn't be. Frustrating. Turns out that a "90% immutable object" just doesn't have the same feel to it.

Alright, so lesson learned (so I thought), moved on. Then I started working on a different project. Same type of deal, writing a bunch of new beans, want to make them immutable so I don't have to worry about whether or not that Person has a name. This time I'm working with JAX-WS, and again the serialization problem hits me right in the face. Ultimately I end up leaving my immutable beans alone, and writing a bunch of data transfer objects (similar to the immutable ones, but mutable). Then I could send those across and reassemble them into immutable versions on my own.

The saga continued when I made my way into JFace Data Binding. I had my Person with their name, age, height, weight, etc. I wanted to have a simple data entry form for a new Person. Problem is, with my immutable Person object, I basically have to set all those properties all at once, during construction. Stopped in my tracks again, since for data binding I basically need a standard Java bean with getters AND setters. Again I find myself writing intermediate objects which ultimately become their immutable counterparts.

The last part of the story came today, when I was poking around the p2 APIs. I was trying to see how the various properties of an IInstallableUnit got set. I find that there are no setters on the interface. Then I stumble across InstallableUnitDescription:

Once created, installable units are immutable. This description class allows a client to build up the state for an installable unit incrementally, and then finally produce the resulting immutable unit.

A-ha! So I'm not the only one doing a bunch of intermediate stuff just to get to that final immutable object (pun intended).
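To make the pattern concrete, here is a minimal sketch of the same idea applied to the Person example (all the names are mine, not from p2 or any of the libraries above):

// Mutable description/builder: accumulates state incrementally,
// plays nicely with serializers and data binding.
public class PersonDescription {
    private String name;
    private int age;

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }

    // Validate once, then produce the immutable result.
    public Person toPerson() {
        if (name == null) throw new IllegalStateException("name is required");
        return new Person(name, age);
    }
}

// The immutable end product (in its own file): safe to share across threads.
public final class Person {
    private final String name;
    private final int age;

    Person(String name, int age) {
        this.name = name;
        this.age = age;
    }

    public String getName() { return name; }
    public int getAge() { return age; }
}

The description is a plain mutable bean, so serialization and data binding are happy; the immutable Person is what the rest of the code shares.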

So I put these questions to you, readers: How are you using immutable objects in your applications? How about in your Eclipse applications? What lessons have you learned?