More Packt news – Christmas $5 offer


So you may have guessed: Packt are running the Christmas promotion again – ebooks for $5 (or £3.60 to us Brits). The promotion runs until Jan 6th.

As you can see through my blog, Packt have some great books – https://mp3muncher.wordpress.com/tag/packt/ – some of which I have helped with during their development (a little shameless self-promotion).

So head over to http://bit.ly/1C4FAaQ

Packt has sent another Camel along


I have been a little inactive on the book reviewing front with Packt in the last couple of months. They have been working on a new Camel book – in addition to the excellent Apache Camel Developer’s Cookbook and several smaller Instant books (Instant Apache Camel Message Routing, Instant Apache Camel Messaging System) – and asked me to do another pre-publication review.

Does Packt need another Camel book? Maybe. From what I have reviewed so far, this new volume focuses on custom development and certainly starts with the real Camel development basics, so it feels more targeted at a complete newbie to Camel. Several chapters in, the book hasn’t yet tackled in any depth how Camel supports Enterprise Integration Patterns – although that may yet come. Watch this space as I’ll post on how the book is likely to shape up.

Mitigating Risks of Cloud Services


As previously blogged, there are risks with using cloud that differ from self-hosted solutions. SaaS, PaaS and all the other XaaS offerings aren’t a panacea. Hopefully you won’t become the next Sony, as the provider keeps you patched and so on. But what if the SaaS provider you’re using goes bust, or you get into litigation with your provider and as a result lose access to your data? It could potentially be months whilst the lawyers sort things out – a horrible situation that no one wants to find themselves in. But how to mitigate such risks?

Any half-decent SaaS provider should give you the means to get a view of all your data directly, through generic or custom report(s), or should make available the means for an export of your data. The latter approach may well come with a cost. If your SaaS solution holds a lot of data – for example a multinational’s HR solution – you may want to target just the extraction of deltas. This means extra donkey work and someone to ensure it is happening. How frequently should depend upon your business needs, expressed through an agreed Recovery Point Objective and your tolerance for potential data loss, as you can assume you’ll lose everything since the last snapshot. If you have middleware in front of your SaaS service you can use a wiretap to reduce the risk here.
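To make the delta-extract and RPO idea concrete, here is a minimal Python sketch. The record structure and the in-memory stand-in for the vendor’s export API are my own assumptions purely for illustration – a real implementation would call the provider’s report or export endpoint.

```python
from datetime import datetime, timedelta

# Stand-in for the SaaS provider's export API; in practice this would be
# an HTTP call to the vendor's report or export endpoint.
SAAS_RECORDS = [
    {"id": 1, "updated": datetime(2015, 1, 1)},
    {"id": 2, "updated": datetime(2015, 1, 5)},
    {"id": 3, "updated": datetime(2015, 1, 9)},
]

def extract_delta(last_snapshot, records=SAAS_RECORDS):
    """Return only the records changed since the last snapshot (the delta)."""
    return [r for r in records if r["updated"] > last_snapshot]

def within_rpo(last_snapshot, now, rpo=timedelta(days=7)):
    """Check the snapshot cadence still honours the agreed Recovery Point Objective."""
    return now - last_snapshot <= rpo
```

The `within_rpo` check is the "someone to ensure it is happening" part: if the last snapshot is older than the agreed RPO, your potential data loss has already exceeded what the business signed up to.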

Your net position, in the event of a loss or a prolonged service outage (remember even Amazon have had multi-day failures, and not all SaaS solutions follow the good cloud practice of being able to fail over to secondary centres), is that you have your data and can at least cobble something together to bridge the gap. Unless your SaaS vendor is offering you something very unique, they will probably have competitors who are more than likely to be glad to help you import the data into their solution.

All this for a case of paranoia? Well, actually you can harvest a raft of other benefits from taking full data extracts – for example reconciliation with a view to managing data quality; statistics from Experian show the value of resolving discrepancies. That is to say, you might find data errors between systems resulting from edge scenarios, such as errors being mishandled in the integration layer. To illustrate the point, let’s assume that your web sales channel is via a SaaS provider and you’re receiving the sales into your on-premise ERP for fulfilment and accounting. By extracting all transactions from the SaaS solution every week you can identify discrepancies and reconcile any issues between the sales solution and your finance and fulfilment capabilities, ensuring what you have sold is what you have accounted for. If we’re talking about solutions that impact your financial accounting, then at least for US declarations it may be necessary to perform such reconciliation in support of Sarbanes-Oxley (SOX) requirements.
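As a sketch of what such a weekly reconciliation might look like in Python – the record fields are illustrative assumptions, not any particular product’s schema:

```python
def reconcile(saas_sales, erp_sales):
    """Compare a weekly SaaS sales extract against the ERP's records.

    Returns ids missing from the ERP, ids missing from the SaaS extract,
    and ids present in both but with mismatched amounts.
    """
    saas = {s["id"]: s["amount"] for s in saas_sales}
    erp = {e["id"]: e["amount"] for e in erp_sales}
    missing_in_erp = sorted(saas.keys() - erp.keys())
    missing_in_saas = sorted(erp.keys() - saas.keys())
    amount_mismatch = sorted(
        i for i in saas.keys() & erp.keys() if saas[i] != erp[i]
    )
    return missing_in_erp, missing_in_saas, amount_mismatch
```

Each of the three buckets points at a different failure mode: a sale never reaching fulfilment, a record appearing in the ERP with no corresponding sale, or a value corrupted somewhere in the integration layer.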

Add to this that a richer data set can be fed into your Big Data or Data Warehouse environments, potentially allowing you to gain further insights into your activities.

When you are running a hybrid of on-premise and cloud solutions, or even just cloud but with a mix of vendors, don’t just think about your application data; consider whether audit and web traffic information can be retrieved from the vendor too. There may be value in feeding that data into a solution such as Splunk, which may then find a pattern of misuse or attack that would not show up in the monitoring data from your on-premise solutions alone.

The final point I should make is: don’t assume your service provider will give you access to the data as described – look at your contracts before any payment or act of agreement. Ideally such checks should be part of your service due diligence activities (along with escrow etc.). There are SaaS providers who will consider the data their property, not yours, even when the data might be about your employees.

Learning Lessons from Oracle Apps User Group submission


As the build-up to the Oracle Applications User Group conference (Collaborate) progresses, the presenters have been informed of whether their submissions have been accepted. Among many, I made several submissions.

Before I share what I think I should have learnt from making submissions, let me give some background to how we got to where we are. My boss is keen that we have a member of the Enterprise Architecture team with strong Oracle recognition. As we are a customer rather than a partner, the only real opportunity is through the Ace programme as an Associate. I have been as active as the demands of the day job allow with the UK Oracle User Group (UKOUG). We agreed that presenting at something as big as OAUG’s annual conference, Collaborate, would be the next step to making a case.

So whilst at Oracle Open World we finally agreed that step and joined OAUG, only to find we had just a couple of weeks to get our submissions together – during which time I had to get internal sign-off for my submissions plus deal with a family emergency.

So with the scene set, perhaps lesson one: don’t work in haste. OAUG run webinars about how to create submissions – a worthwhile exercise to attend, although they focus on what OAUG provides in the form of submission information (any themes identified for the conference and the amount of information needed) and the process and mechanics of selection. The important message is to temper your expectations, as the selection success rate is about 1 in 6 submissions. I looked at their themes and identified that what I had in mind more or less fitted (big tick for me).

All of this meant I could assemble my submissions, including details for my employer of what internal work and sources were likely to be drawn from. My mistake here was perhaps that I should have done this as soon as we had agreed to try, as it would have let me focus on getting my submissions together sooner.

Perhaps the biggest missed opportunity, having joined OAUG, was not immediately looking through previous conference papers and presentations – and most critically their abstracts – to get a feel for the messages, language and themes of presentations that had been accepted. Understanding how a submission might resonate with those voting on which presentations to accept could have made a big difference. In hindsight I suspect my submission wording was a little too academic, rather than informed by battle-worn insights and how we’ve beaten some challenges.

All of this would have been helped by having actually attended a previous Collaborate conference and got a feel for the ‘character’ of the conference and the people attending. I do know that Oracle Open World and Oracle’s one-day sessions have some commonality in character but feel different, with slight differences in attendance (Oracle sessions are slightly more abstract, customer presentations excepted, and the audience can be a bit more decision-maker oriented), whereas UKOUG attendance is more orientated to those who execute delivery or drive the delivery aspects. Then open source events differ a bit again.

To help inform my thinking and learn how to progress, I have been reading the excellent Confessions of a Public Speaker by Scott Berkun (an amusing and insightful book on public speaking, full of practical, simple advice). Some may say that’s a little masochistic given my submissions weren’t selected, but it certainly helped my thinking about the approach – for example really focusing down on the key message – and about how to prepare if a submission is selected.

To conclude, what now? Well, we will be applying these observations going forward, and will have done the reading of previous submissions and got my submission ideas together by the time submission opportunities open for next year – so no haste. With a little luck I will have attended Collaborate as just a delegate. Then of course there is perhaps Open World as an opportunity.

Single Vendor Cloud


The recent outage of Microsoft Azure raises some interesting questions. This isn’t the first big vendor cloud service outage; Amazon AWS and others have had their moments. Of course this has led to the recommendation that, to ensure your service has continuity, a DR arrangement with a different provider be in place. This works with Platform as a Service. But what we have been seeing is a move from PaaS up the value stack, with vendors offering their own rich ecosystems to build on – from Amazon SQS to Oracle’s latest announcement, the Oracle Internet of Things platform.

These solutions can be built with open standards etc., but ultimately, when used, they create vendor lock-in, as no one else will have an equivalent capability with the same APIs. So how do you mitigate these outages, or even the risk of such an outage? Well, Oracle do claim you can actually run all their cloud capabilities on premise. But is that practical? As cloud is adopted, organisations are going to wind back their hardware capital outlay – after all, that is one of the value points of cloud.

So where does that leave us? Accepting the risk and trying to mitigate it in our own commercial agreements? And consider an IoT solution where you’re doing event stream processing and using period-on-period comparisons to set thresholds: the likely data loss from an outage will have ‘echoes’, as your period analysis has holes in the data, plus false thresholds, because the data hole will skew the comparison when that period is later used as the baseline.
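A toy Python illustration of that echo effect – the numbers and the simple drop-ratio threshold are made up purely to show the mechanism, not taken from any real IoT platform:

```python
def period_total(readings):
    """Period-on-period comparison metric: total events seen in the period."""
    return sum(readings)

def looks_anomalous(current_total, previous_total, drop_ratio=0.5):
    """Flag the current period if it falls below half the previous period."""
    return current_total < previous_total * drop_ratio

# A normal week of readings vs the same week with an outage hole.
normal_week = [100, 100, 100, 100, 100, 100, 100]  # total 700
outage_week = [100, 100, 0, 0, 0, 0, 100]          # total 300: data lost, not absent

# Echo 1: the outage week itself looks like an anomaly (a false alarm).
false_alarm = looks_anomalous(period_total(outage_week), period_total(normal_week))

# Echo 2: once the outage week becomes the baseline, a genuine later
# drop to 400 is masked, because the skewed baseline makes 400 look healthy.
masked_drop = not looks_anomalous(400, period_total(outage_week))
```

Both echoes come from the same hole: first as a false positive, then as a false negative when the damaged period rolls around as the comparison baseline.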

Difficult questions with no obvious answers, other than to mitigate things commercially and push Microsoft and others to make things more robust – time for Netflix’s Chaos Monkey?

Next Generation SOA book


My copy of Next Generation SOA from Service Tech Press/Prentice Hall arrived today. This is the first Prentice Hall book I have contributed to as a pre-publication reviewer. It is always nice to see a recognition in the book, particularly when the draft is of such a high standard that your feedback is more about helping finesse things.

Acknowledgements

Next Generation SOA

I have previously blogged that this is a book I would highly recommend. It isn’t a vast heavyweight text, but provides a great broad view of SOA in the current IT landscape. If you’re not averse to eBooks you might consider getting the ebook version, as the diagrams look far better in colour than in the grayscale of the print edition.


Thinking about a COTS Mobile Strategy


As more and more software capability becomes commoditised and off the shelf, either in the form of COTS packages or SaaS solutions, new considerations and challenges arrive for customer businesses.

Recently we had an initial conversation at the architectural level with a vendor who have a well-thought-out end-to-end offering, which includes mobile apps for our customers (a white label offering). This is great in many respects, as we can depend on our vendor to deal with the challenges of keeping mobile apps up to date and contend with the ever-changing mobile landscape.

If as a business you don’t fit into the classic solution stacks – for example Boots (now part of Walgreens) isn’t just a retailer but also a prescription pharmacist – you’re likely to be sourcing solutions from different suppliers. The challenge comes from the fact that you’re likely to want to leverage different white label apps for different offerings, e.g. prescriptions, eye tests etc. (which are potentially going to come from different vendors, particularly if you want to adopt a best-of-breed set of solutions). It is however important that you get consistency in look and feel, and unified authentication across the different apps is essential when presenting solutions to end customers. The worst thing in the world would be to have different authentications, meaning the customer has to remember multiple passwords to engage with your business.

The look and feel can to an extent be dealt with by the fact that a lot of contemporary mobile applications are built on a hybrid framework, so you can use CSS3 to drive standardisation, or at least colour and branding; this plays to a white label strategy.

Authentication could be more challenging: you need consistency in approach between all of the applications. There are standards out there such as OpenID, but you need the different apps to offer the same authentication sources. Even OpenID has issues, and support for the latest version of the standard looks somewhat mixed. In addition there is a degree of fragmentation – for example Facebook used to support OpenID but now has Facebook Connect.

So if you wanted to offer a voucher system from your POS (Point of Sale) provider, and perhaps an ordering app built around your ERP from a different vendor, what are the chances of having consistency?

If you know all your vendors when launching your mobile solutions you can look for common denominators and drive in that direction. But this is a rare situation. All of this is ideally linked to your normal website as well, which may well be tied to a different solution such as Oracle’s OID.

If the apps offer their own LDAP authentication service then you have the possibility of synchronising these repositories, so that if the user interacts with one app their details can be pushed to the other apps’ repositories through your own integration layer.
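A minimal sketch of that synchronisation idea, with plain dicts standing in for the two apps’ user repositories – the attribute names are illustrative assumptions, and a real version would talk to the LDAP directories through your integration layer rather than in-memory structures:

```python
def sync_user(source_repo, target_repo, uid):
    """Push one user's record from one app's repository into another's.

    Merges rather than overwrites, so app-specific attributes in the
    target (e.g. a role only that app knows about) survive the sync.
    """
    record = source_repo.get(uid)
    if record is None:
        raise KeyError(f"unknown user: {uid}")
    existing = target_repo.get(uid, {})
    target_repo[uid] = {**existing, **record}
    return target_repo[uid]
```

The merge-not-overwrite choice matters: each white label app will carry attributes the others know nothing about, and a naive replace would silently destroy them on every sync.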

In a perfect world your white label apps will have the means to be configured to connect to a single LDAP server; in that case you can get things aligned, at least for authentication.

Without this there is going to be a pretty challenging situation, to the point that the ROI on a consistent user experience needs to be seriously examined, and it may be time to think about building the solutions yourself.

Hierarchy of Data Assurance


I was discussing the challenges of ensuring that data is protected and proven to have integrity, and that as the data moves through systems there isn’t data loss. This sort of thing starts at the simplest level with data validation; with the most advanced and greatest investment you have some end-to-end reconciliation framework.

Obviously this thinking doesn’t work in every environment. For example, in complex event processing (CEP) you’re going to just accept the data coming through; if it’s incomplete or data has been lost along the way, you accept it as it is – these conditions will create outliers which will get smoothed out in trends. It is possible you will have created the gaps yourself by dropping data that is slow to arrive. But for the majority of run-of-the-mill solutions such as accounting, HR and so on, the thinking stands up.

To communicate effectively to senior management the risks of just focusing on functional delivery, and whether there is maturity in the delivery capability, we hit on the idea of using a variant of Maslow’s hierarchy of needs – something I think everyone gets. You can see our representation here:

hierarchy of Data Assurance

The interesting thing is that you could look at the triangle and suggest that, typically, the more pressed a project is on factors such as volume of functionality, cost and/or time, the more likely the project is to remain at the bottom of the triangle. But the width of the triangle at the point of the capabilities realised also reflects the operational costs: if you’re at the bottom of the triangle, you’re likely to incur more costs dealing with data issues, as the means to detect and then resolve them are a lot more restricted.

Frameworks such as those in Oracle’s SOA Suite and AIA should make it easier to move at least part-way up the triangle, although full end-to-end reconciliation is more likely to demand more data-centric tools, as you probably want to perform batch-like assessments.

Oracle User Groups


It has been an active time with Oracle User Groups. In addition to the recent SIG, we have been reviewing articles for the next UKOUG Scene magazine, and there have been some great submissions. Whilst at Oracle Open World I signed my employer up to join the OAUG and quickly put together several submissions for Collaborate 15 (the OAUG conference in April). For me, OAUG also extends to being an organisational Ambassador. Then finally we have shared a profile on the UKOUG site as to why we have become members.

Oracle middleware cloud – what does it mean to Mulesoft and Apigee?


Oracle will soon be launching two cloud offerings: a hosted approach to their heavyweight SOA Suite middleware and, potentially more important for some of the cloud integration players like Mulesoft and Apigee, a lighter solution with a web interface IDE. This lighter solution is clearly aiming at (and statements have been made to the effect of) the Gartner pace layering ethos, where you want to quickly link existing services together to offer new capabilities. The new cloud integration service will be aware of all the other Oracle cloud service APIs you have and provide smart prebuilt transformations, which you can extend or change if you want. For non-Oracle integrations the service is meant to use some intelligence and heuristics, built from how other customers have realised mappings, to make suggestions – with control frameworks for security, access, errors etc. based on policy mechanisms.

The solution includes access to prebuilt connectors, obviously to Oracle products, but also to the likes of Salesforce and Workday, with more such as SuccessFactors coming. When combined with other new cloud offerings such as their new mobile apps, the pacing message becomes a lot stronger. Add to this the cloud adoption of the CEP (Complex Event Processing) engine (which looks very good) and the addition next year of several API tools for catalogue and realtime discovery, and they will have a pretty solid suite.

With this lighter weight cloud solution there is meant to be a means to pull the integrations out of the cloud and into on-premise middleware deployments. This makes sense, as a lot of the capability looks to be built on top of OSB.

Add to all of this the other service offerings being launched, such as Dropbox-like distributed document storage with Google Docs-like collaboration, and there is a very potent story for the Oracle one-stop shop. So you could weigh Oracle on best-of-breed integration merits, but convenience and the old ‘no one got fired for buying Oracle’ factor is likely to be the ruling story.

I suspect you will see Oracle appear strongly in the iPaaS assessments by Gartner soon.

Given Ellison has indicated that the new cloud services from Oracle will be aggressively priced, it will be interesting to see how the smaller players differentiate themselves. I suspect one of the keys will be the speed at which their cloud solutions offer new capabilities, both at the product core and through connectors. Prior to the 12c launch, the rate of change in the middleware space didn’t appear to be rapid.