It has been an interesting year. I got to attend more conferences this year than any other year before. I left one company and joined another. The GBN grew and then ultimately merged into ASUG. BOB grows ever closer to the next milestone (600,000 posts and 50,000 users). My BI blog (this site) has been running for eighteen months now, which some say is the average lifespan of a personal site or blog.
I’m not done yet. I’m done for this year, but I’ll see you again shortly. 😎
Best wishes to you and yours during this holiday season.
Earlier this year I attended SAP TechEd 2009. Many of their sessions were lecture-only, but they also provided a number of two- or four-hour hands-on sessions. I selected one specific session in order to learn about improvements in the process used to build universes against SAP data sources like BEx queries. But of course I could not leave it at that. 🙂 I got to the session a bit early and started poking around on the laptop to see if I could get some hints as to what we were going to cover. While poking around I found a universe named “Foodmart,” so I opened it. It was… interesting. Continue reading “Foodmart 2000 Universe Review – Part I: Introduction”
In the first post in this series I defined what time-sliced measures are and why they can be useful in a universe. In the second post I described a special calendar table that was designed and built to support the requirements for this solution. I also showed how the join logic worked in conjunction with the table design. This post completes the implementation. I am finally going to work on the measure objects that a user will see.
In any universe design project I strive for the following goals:
- Deliver the correct result
In my opinion, this is always the number one goal in any universe design.
- User friendly
This is quite important, but secondary to correctness.
- Easy to maintain
Universe maintainability is always allowed to suffer in order to deliver the first two goals on this list, but it remains worth striving for nonetheless.
In this post I will show how all three of these goals are ultimately met by this implementation. When I am done I will have a completed universe. This post will cover slides 26 through 30 from my 2008 GBN Conference presentation. There is a link to download the file at the end of this post. Continue reading “Time Sliced Measures Part III: Making Measures”
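As a rough preview of where this lands, here is a minimal sketch of what a time-sliced measure object can look like once the calendar table and its join are in place. One common way to build such measures is a conditional aggregate keyed off period flags in the calendar table; the table, column, and object names below are illustrative placeholders and may differ from the exact definitions in the presentation.

```sql
-- Illustrative sketch only: time_slice, sales_fact, and the flag columns
-- are placeholder names, not the actual objects from the presentation.

-- "Current Month Sales" measure object (SELECT definition in the universe):
SUM(CASE WHEN time_slice.current_month_flag = 'Y'
         THEN sales_fact.sales_amount END)

-- "Year To Date Sales" measure object:
SUM(CASE WHEN time_slice.ytd_flag = 'Y'
         THEN sales_fact.sales_amount END)

-- Both measures rely on the single join already defined in the structure,
-- along the lines of: sales_fact.sales_date = time_slice.calendar_date
```

Because the period logic lives in the calendar table and the join, each measure stays a one-line conditional aggregate, which is what keeps this kind of universe easy to maintain.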
This tip comes courtesy of Joe Szabo. I met Joe many years ago at a client where we both worked. A few weeks ago we had a casual conversation in the hallway at the 2009 GBN conference. I don’t remember how we got started, but the subject of documenting complex Web Intelligence reports came up somehow. I was probably complaining, and Joe said something like, “… but don’t you just do blah blah blah? That’s what I do, and it works great.” This post is going to be all about the “blah blah blah” that Joe shared with me. It will help you provide documentation for complex Web Intelligence reports. It will even help debug reports. And best of all, it will help you determine exactly what is different between two versions of the same report so you can make sure the right one gets migrated into production.
I am going to show screenshots from Web Intelligence 3.0 for this blog post, but the same process works in XI R2 as well. Continue reading “Quick Tip: Detailing Web Intelligence Document Contents”
In Part I of this series I talked briefly about the need for report writers to sometimes “make up” data. In that post I showed how I could use the Web Intelligence Rich Client (or alternatively Desktop Intelligence) to import data from a spreadsheet in order to fill out holes in data. In this post I am going to show an equivalent solution using multiple data providers from a universe instead. I will redo the same example shown before (with a lot fewer screenshots since quite a bit of the process is the same). Because I am using a universe I can show two different possible solutions. Continue reading “Making Up Data Part II: Using Universe Data”
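One way this technique can work (not necessarily one of the two solutions shown in the full post) is to add a second data provider from the same universe that returns every dimension value, even the ones with no facts, and then merge it with the main query in the report. Here is a minimal sketch of the two generated queries, assuming placeholder table names (calendar and sales_fact) rather than the actual universe objects:

```sql
-- Illustrative sketch only: calendar and sales_fact are placeholder names,
-- not the actual universe objects used in the post.

-- Data provider 1: the real result set; months with no sales simply drop out.
SELECT   calendar.year_month,
         SUM(sales_fact.sales_amount)
FROM     sales_fact
JOIN     calendar
  ON     calendar.calendar_date = sales_fact.sales_date
GROUP BY calendar.year_month;

-- Data provider 2: the "scaffold" query; every month, no measures at all.
SELECT DISTINCT calendar.year_month
FROM            calendar;
```

Merging the two providers on the common dimension in the report means every month shows up in the block, with an empty measure value for the months the fact query could not return.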