<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
  <channel>
    <title>rmoff&#39;s random ramblings</title>
    <link>https://rmoff.net/</link>
    <description>Recent content on rmoff&#39;s random ramblings</description>
    <generator>Hugo</generator>
    <language>en-us</language>
    <lastBuildDate>Fri, 13 Mar 2026 15:34:37 +0000</lastBuildDate>
    <atom:link href="https://rmoff.net/index.xml" rel="self" type="application/rss+xml" />
    <item>
      <title>Evaluating Claude&#39;s dbt Skills: Building an Eval from Scratch</title>
      <link>https://rmoff.net/2026/03/13/evaluating-claudes-dbt-skills-building-an-eval-from-scratch/</link>
      <pubDate>Fri, 13 Mar 2026 15:34:37 +0000</pubDate>
      <guid>https://rmoff.net/2026/03/13/evaluating-claudes-dbt-skills-building-an-eval-from-scratch/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I wanted to explore the extent to which Claude Code could build a data pipeline using dbt without iterative prompting.&#xA;What difference did skills, models, and the prompt itself make?&#xA;I’ve written &lt;a href=&#34;https://rmoff.net/2026/03/11/claude-code-isnt-going-to-replace-data-engineers-yet/&#34;&gt;in a separate post&lt;/a&gt; about what I found (&lt;em&gt;yes it’s good; no it’s not going to replace data engineers, yet&lt;/em&gt;).&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;In this post I’m going to show how I ran these tests (with Claude) and analysed the results (using Claude), including a pretty dashboard (created by Claude):&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;span class=&#34;image&#34;&gt;&lt;img src=&#34;https://rmoff.net/images/2026/03/dashboard01.webp&#34; alt=&#34;dbt EVAL dashboard showing test results across prompt, skill, and model combinations with LLM judge scores out of 27&#34;/&gt;&lt;/span&gt;&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>How I do, and don&#39;t, use AI on this blog</title>
      <link>https://rmoff.net/ai/</link>
      <pubDate>Thu, 12 Mar 2026 11:16:15 +0000</pubDate>
      <guid>https://rmoff.net/ai/</guid>
      <description>&lt;div class=&#34;admonitionblock important&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-important&#34; title=&#34;Important&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;&lt;div class=&#34;title&#34;&gt;tl;dr&lt;/div&gt;&#xA;&lt;div class=&#34;ulist&#34;&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;I use AI heavily on this blog.&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;I don’t use AI to write any content.&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;/div&gt;&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;hr/&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;em&gt;As any followers of my blog will have seen recently, I am &lt;a href=&#34;https://rmoff.net/categories/ai/&#34;&gt;a big fan&lt;/a&gt; of the &lt;a href=&#34;https://rmoff.net/2026/01/27/reflections-of-a-developer-on-llms-in-january-2026/&#34;&gt;productivity&lt;/a&gt;—and enjoyment—that AI can bring to one’s work.&lt;/em&gt;&#xA;&lt;em&gt;(In fact, &lt;a href=&#34;https://rmoff.net/2026/03/06/ai-will-fuck-you-up-if-youre-not-on-board/&#34;&gt;I firmly believe&lt;/a&gt; that to opt out of using AI is a somewhat negative step to take in terms of one’s career.)&lt;/em&gt;&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;strong&gt;Here’s how I don’t use AI, &lt;em&gt;and never will&lt;/em&gt;&lt;/strong&gt;:&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Claude Code isn&#39;t going to replace data engineers (yet)</title>
      <link>https://rmoff.net/2026/03/11/claude-code-isnt-going-to-replace-data-engineers-yet/</link>
      <pubDate>Wed, 11 Mar 2026 15:16:15 +0000</pubDate>
      <guid>https://rmoff.net/2026/03/11/claude-code-isnt-going-to-replace-data-engineers-yet/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Ten years late (but hopefully not &lt;a href=&#34;https://en.wiktionary.org/wiki/a_day_late_and_a_dollar_short&#34;&gt;a dollar short&lt;/a&gt;) I recently figured out &lt;a href=&#34;https://rmoff.net/2026/02/19/ten-years-late-to-the-dbt-party-duckdb-edition/&#34;&gt;what all the fuss about dbt is about&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;quoteblock&#34;&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;em&gt;Well that’s cute, Robin&lt;/em&gt;, you might be saying.&#xA;&lt;em&gt;Congratulations for catching up on what data/analytics engineers have been doing for years now.&lt;/em&gt;&#xA;&lt;em&gt;But you see, coding by hand is &lt;strong&gt;so&lt;/strong&gt; 2025.&lt;/em&gt;&#xA;&lt;em&gt;Didn’t you hear?&lt;/em&gt;&#xA;&lt;strong&gt;AI is going to replace data engineers.&lt;/strong&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;No it’s not (at least, not yet).&#xA;In fact, used incorrectly, it’ll do a worse job than you.&#xA;But used right, it’s a kick-ass tool that any data engineer should be adding to their toolbox &lt;em&gt;today&lt;/em&gt; &lt;sup&gt;&lt;a href=&#34;#srsly&#34;&gt;*&lt;/a&gt;&lt;/sup&gt;.&#xA;In this article I’ll show you why.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Claude Code in action with dbt</title>
      <link>https://rmoff.net/2026/03/11/claude-code-in-action-with-dbt/</link>
      <pubDate>Wed, 11 Mar 2026 15:15:15 +0000</pubDate>
      <guid>https://rmoff.net/2026/03/11/claude-code-in-action-with-dbt/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;em&gt;This is an addendum to the main post about using Claude Code with dbt&lt;/em&gt;.&#xA;&lt;em&gt;It shows an excerpt of a Claude session log so you can see exactly what goes on &amp;#34;under the covers&amp;#34;&lt;/em&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;For full details of the prompt, commentary, and conclusions, see &lt;a href=&#34;https://rmoff.net/2026/03/11/claude-code-isnt-going-to-replace-data-engineers-yet/&#34;&gt;&lt;strong&gt;Claude Code isn’t going to replace data engineers (yet)&lt;/strong&gt;&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Here we can see the steps that Claude Code takes as it figures out for itself anomalies in the data and adapts the dbt model to handle them.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>AI will fuck you up if you’re not on board</title>
      <link>https://rmoff.net/2026/03/06/ai-will-fuck-you-up-if-youre-not-on-board/</link>
      <pubDate>Fri, 06 Mar 2026 15:16:15 +0000</pubDate>
      <guid>https://rmoff.net/2026/03/06/ai-will-fuck-you-up-if-youre-not-on-board/</guid>
      <description>&lt;div class=&#34;sect1&#34;&gt;&#xA;&lt;h2 id=&#34;_yes_youre_right&#34;&gt;Yes, you’re right&lt;/h2&gt;&#xA;&lt;div class=&#34;sectionbody&#34;&gt;&#xA;&lt;div class=&#34;ulist&#34;&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;AI slop is &lt;a href=&#34;https://rmoff.net/2025/11/25/ai-smells-on-medium/&#34;&gt;ruining the internet&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;Given half a chance AI will delete your inbox or worse (even if you work in Safety and Alignment at Meta):&lt;/p&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;blockquote class=&#34;twitter-tweet&#34;&gt;&lt;p lang=&#34;en&#34; dir=&#34;ltr&#34;&gt;Nothing humbles you like telling your OpenClaw “confirm before acting” and watching it speedrun deleting your inbox. I couldn’t stop it from my phone. I had to RUN to my Mac mini like I was defusing a bomb. &lt;a href=&#34;https://t.co/XAxyRwPJ5R&#34;&gt;pic.twitter.com/XAxyRwPJ5R&lt;/a&gt;&lt;/p&gt;&amp;mdash; Summer Yue (@summeryue0) &lt;a href=&#34;https://twitter.com/summeryue0/status/2025774069124399363?ref_src=twsrc%5Etfw&#34;&gt;February 23, 2026&lt;/a&gt;&lt;/blockquote&gt;&#xA;&lt;script async src=&#34;https://platform.twitter.com/widgets.js&#34; charset=&#34;utf-8&#34;&gt;&lt;/script&gt;&#xA;&#xA;&#xA;&lt;/div&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;Low-effort AI contributions are &lt;a href=&#34;https://redmonk.com/kholterhoff/2026/02/03/ai-slopageddon-and-the-oss-maintainers/&#34;&gt;harming the open-source ecosystem&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;LLMs hallucinate&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;…etc etc, ad infinitum.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;/div&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;sect1&#34;&gt;&#xA;&lt;h2 id=&#34;_but_youre_also_so_so_wrong&#34;&gt;But you’re also so, so wrong.&lt;/h2&gt;&#xA;&lt;div class=&#34;sectionbody&#34;&gt;</description>
    </item>
    <item>
      <title>Interesting links - February 2026</title>
      <link>https://rmoff.net/2026/02/27/interesting-links-february-2026/</link>
      <pubDate>Fri, 27 Feb 2026 09:06:08 +0000</pubDate>
      <guid>https://rmoff.net/2026/02/27/interesting-links-february-2026/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Phew, what a month!&#xA;February may be shorter but that’s not diminished the wealth of truly interesting posts I’ve found to share with you this month.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Claude the Instructor</title>
      <link>https://rmoff.net/2026/02/20/claude-the-instructor/</link>
      <pubDate>Fri, 20 Feb 2026 15:21:07 +0000</pubDate>
      <guid>https://rmoff.net/2026/02/20/claude-the-instructor/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;How do you use your LLM coding agent?&#xA;Mine is usually Claude the proofreader, Claude the bash monkey, Claude the webdev.&#xA;All these things are about tasks &lt;em&gt;completed&lt;/em&gt;.&#xA;Read this, write that code, fix that web page.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;This week I gave Claude a new job.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Ten years late to the dbt party (DuckDB edition)</title>
      <link>https://rmoff.net/2026/02/19/ten-years-late-to-the-dbt-party-duckdb-edition/</link>
      <pubDate>Thu, 19 Feb 2026 12:28:13 +0000</pubDate>
      <guid>https://rmoff.net/2026/02/19/ten-years-late-to-the-dbt-party-duckdb-edition/</guid>
      <description>&lt;div class=&#34;quoteblock&#34;&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Apparently, you &lt;strong&gt;can&lt;/strong&gt; teach an old dog new tricks.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Last year I wrote &lt;a href=&#34;https://rmoff.net/2025/03/20/building-a-data-pipeline-with-duckdb/&#34;&gt;a blog post&lt;/a&gt; about building a data processing pipeline using DuckDB to ingest weather sensor data from the &lt;a href=&#34;https://environment.data.gov.uk/flood-monitoring/doc/reference&#34;&gt;UK’s Environment Agency&lt;/a&gt;.&#xA;The pipeline was based around a set of SQL scripts, and whilst it used important data engineering practices like data modelling, it sidestepped the elephant in the room for code-based pipelines: dbt.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Cosplaying as a webdev with Claude Code in January 2026</title>
      <link>https://rmoff.net/2026/01/27/cosplaying-as-a-webdev-with-claude-code-in-january-2026/</link>
      <pubDate>Tue, 27 Jan 2026 14:42:42 +0000</pubDate>
      <guid>https://rmoff.net/2026/01/27/cosplaying-as-a-webdev-with-claude-code-in-january-2026/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;strong&gt;&lt;em&gt;In which Claude and [A]I play at being webdevs.&lt;/em&gt;&lt;/strong&gt;&#xA;&lt;strong&gt;&lt;em&gt;For some reflections on the bigger picture of AI as a productivity tool for developers, have a look at &lt;a href=&#34;https://rmoff.net/2026/01/27/reflections-of-a-developer-on-llms-in-january-2026/&#34;&gt;the companion post to this one&lt;/a&gt;.&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I used to speak at a lot of conferences and meetups, and published my talks on a site called &lt;code&gt;noti.st&lt;/code&gt;.&#xA;It’s free to use, but you could pay for bells and whistles including a custom domain, which I duly did: &lt;code&gt;talks.rmoff.net&lt;/code&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Reflections of a Developer on LLMs in January 2026</title>
      <link>https://rmoff.net/2026/01/27/reflections-of-a-developer-on-llms-in-january-2026/</link>
      <pubDate>Tue, 27 Jan 2026 13:40:58 +0000</pubDate>
      <guid>https://rmoff.net/2026/01/27/reflections-of-a-developer-on-llms-in-january-2026/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Funnily enough, Charles Dickens was talking about late 18th century Europe rather than the state of AI and LLMs in 2026, but here goes:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;quoteblock&#34;&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness, it was the epoch of belief, it was the epoch of incredulity, it was the season of light, it was the season of darkness, it was the spring of hope, it was the winter of despair.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Interacting with Developers on Reddit</title>
      <link>https://rmoff.net/2026/01/23/interacting-with-developers-on-reddit/</link>
      <pubDate>Fri, 23 Jan 2026 16:25:37 +0000</pubDate>
      <guid>https://rmoff.net/2026/01/23/interacting-with-developers-on-reddit/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;LLMs are &lt;em&gt;rapidly&lt;/em&gt; changing how we use the internet.&#xA;Remember just a few years ago when you’d search for something on Google and &lt;em&gt;scroll&lt;/em&gt; through the results like some kind of Neanderthal?&#xA;Heck, you might even click through to page 2 if you were feeling spicy.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;These days—&lt;em&gt;and, knowing how this stuff ages, I should perhaps be less broad than &amp;#34;these days&amp;#34; and say just &amp;#34;in January 2026&amp;#34;&lt;/em&gt;—Google’s AI Overview at the top of the results has got pretty good for basic stuff, making looking at the actual search results less necessary.&#xA;That’s if folk even &lt;em&gt;get&lt;/em&gt; to Google, when they’ve got an LLM close at hand to answer any and every question that they throw at it (regardless of whether it’s a lazy &amp;#34;&lt;em&gt;how do you spell irony&lt;/em&gt;&amp;#34; or somewhat more LLM-appropriate &amp;#34;&lt;em&gt;ELI5 nuclear fusion&lt;/em&gt;&amp;#34;).&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Interesting links - January 2026</title>
      <link>https://rmoff.net/2026/01/20/interesting-links-january-2026/</link>
      <pubDate>Tue, 20 Jan 2026 10:14:50 +0000</pubDate>
      <guid>https://rmoff.net/2026/01/20/interesting-links-january-2026/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;em&gt;This is the twelfth edition of this newsletter in its current form.&lt;/em&gt;&#xA;&lt;em&gt;It’s great to see the audience for it growing, and the consistently positive reception when I share it.&lt;/em&gt;&#xA;&lt;em&gt;Nice words always inspire me to carry on with it :D&lt;/em&gt;&#xA;&lt;em&gt;The &lt;a href=&#34;https://interestinglinks.substack.com/&#34;&gt;substack edition&lt;/a&gt; (which is exactly the same content but sent out by email) is also picking up views and subscribers.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Alternatives to MinIO for single-node local S3</title>
      <link>https://rmoff.net/2026/01/14/alternatives-to-minio-for-single-node-local-s3/</link>
      <pubDate>Wed, 14 Jan 2026 09:42:10 +0000</pubDate>
      <guid>https://rmoff.net/2026/01/14/alternatives-to-minio-for-single-node-local-s3/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;In late 2025 the company behind MinIO &lt;a href=&#34;https://github.com/minio/object-browser/pull/3509&#34;&gt;decided&lt;/a&gt; &lt;a href=&#34;https://github.com/minio/minio/issues/21647#issuecomment-3418675115&#34;&gt;to&lt;/a&gt; &lt;a href=&#34;https://github.com/minio/minio/commit/27742d469462e1561c776f88ca7a1f26816d69e2&#34;&gt;abandon&lt;/a&gt; it to pursue other commercial interests.&#xA;As well as upsetting a bunch of folk, it also put the cat amongst the pigeons of many software demos that relied on MinIO to emulate S3 storage locally, not to mention build pipelines that used it for validating S3 compatibility.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;In this blog post I’m going to look at some alternatives to MinIO.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>A love letter to Raycast ❤️</title>
      <link>https://rmoff.net/2025/12/18/a-love-letter-to-raycast/</link>
      <pubDate>Thu, 18 Dec 2025 15:59:31 +0000</pubDate>
      <guid>https://rmoff.net/2025/12/18/a-love-letter-to-raycast/</guid>
      <description>&lt;div class=&#34;quoteblock&#34;&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&amp;#34;What are the must-have apps to install on my new Mac?&amp;#34;…&#xA;&amp;#34;Which tool makes you the most productive?&amp;#34;…&#xA;&amp;#34;Do you still use Alfred?&amp;#34;…&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;All these questions and more—and the answer to all of them is Raycast!&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Interesting links - December 2025</title>
      <link>https://rmoff.net/2025/12/16/interesting-links-december-2025/</link>
      <pubDate>Tue, 16 Dec 2025 10:08:00 +0000</pubDate>
      <guid>https://rmoff.net/2025/12/16/interesting-links-december-2025/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Well it’s that time of year already!&#xA;Whilst munching on a &lt;a href=&#34;https://en.wikipedia.org/wiki/Mince_pie&#34;&gt;mince pie&lt;/a&gt;, enjoy the final Interesting Links for 2025.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;em&gt;It’s been a busy twelve months for me; this time last year I was signing off from my last company, which went on to be &lt;a href=&#34;https://redis.io/blog/redis-to-acquire-decodable-to-turbocharge-our-real-time-data-platform/&#34;&gt;acquired&lt;/a&gt;—and last week I found out that my current company (Confluent) is to be &lt;a href=&#34;https://newsroom.ibm.com/2025-12-08-ibm-to-acquire-confluent-to-create-smart-data-platform-for-enterprise-generative-ai&#34;&gt;acquired by IBM&lt;/a&gt;.&lt;/em&gt;&#xA;&lt;em&gt;Despite my reaction against any kind of &lt;a href=&#34;https://en.wikipedia.org/wiki/Who_Moved_My_Cheese%3F&#34;&gt;cheese moving&lt;/a&gt;, I figure this is going to be an interesting development and a whole new experience for me :)&lt;/em&gt;&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Using Graph Analysis with Neo4j to Spot Astroturfing on Reddit</title>
      <link>https://rmoff.net/2025/12/01/using-graph-analysis-with-neo4j-to-spot-astroturfing-on-reddit/</link>
      <pubDate>Mon, 01 Dec 2025 11:55:22 +0000</pubDate>
      <guid>https://rmoff.net/2025/12/01/using-graph-analysis-with-neo4j-to-spot-astroturfing-on-reddit/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Reddit is one of the longer-standing platforms on the internet, bringing together folk to discuss, rant, grumble, and troll others on all sorts of topics, from &lt;a href=&#34;https://old.reddit.com/r/apachekafka/&#34;&gt;Kafka&lt;/a&gt; to &lt;a href=&#34;https://old.reddit.com/r/dataengineering/&#34;&gt;data engineering&lt;/a&gt; to &lt;a href=&#34;https://old.reddit.com/r/flashlight/&#34;&gt;nerding out over really bright torches&lt;/a&gt; to &lt;a href=&#34;https://old.reddit.com/r/britishproblems/&#34;&gt;grumbling about the state of the country&lt;/a&gt;—and a whole lot more.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;As a social network it’s a prime candidate for using graph analysis to examine how people interact—and in today’s post, hunt down some sneaky shills ;-)&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Interesting links - November 2025</title>
      <link>https://rmoff.net/2025/11/26/interesting-links-november-2025/</link>
      <pubDate>Wed, 26 Nov 2025 09:52:09 +0000</pubDate>
      <guid>https://rmoff.net/2025/11/26/interesting-links-november-2025/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Welcome to the 10th edition of &lt;em&gt;Interesting Links&lt;/em&gt;.&#xA;I’ve got over a hundred links for you this month—all of them, IMHO, interesting :)&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>(AI) Smells on Medium</title>
      <link>https://rmoff.net/2025/11/25/ai-smells-on-medium/</link>
      <pubDate>Tue, 25 Nov 2025 12:46:34 +0000</pubDate>
      <guid>https://rmoff.net/2025/11/25/ai-smells-on-medium/</guid>
      <description>&lt;p&gt;As part of compiling the monthly &lt;a href=&#34;https://rmoff.net/categories/interesting-links/&#34;&gt;interesting links&lt;/a&gt; posts, I go through a ton of RSS feeds, sourced from specific blogs that I follow as well as general aggregators.&#xA;These aggregators include quality sources like InfoQ, and certain tags on lobste.rs.&#xA;Here I&amp;rsquo;ll often find some good articles that I missed in my general travels around the social media feeds in the previous month.&#xA;I also, so you don&amp;rsquo;t have to, dive into the AI slop-pit that is Medium and various categories feeds.&#xA;In amongst the detritus and sewage of LLMs left to ramble unchecked are the occasional proverbial diamonds in the rough, which make the sifting worth the effort.&lt;/p&gt;&#xA;&lt;p&gt;I thought it might be interesting—and a useful vent to preserve my sanity—to note down some of the &amp;ldquo;smells&amp;rdquo; I&amp;rsquo;ve noticed.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Stumbling into AI: Part 6—I&#39;ve been thinking about Agents and MCP all wrong</title>
      <link>https://rmoff.net/2025/11/20/ive-been-thinking-about-agents-and-mcp-all-wrong/</link>
      <pubDate>Thu, 20 Nov 2025 03:42:10 +0000</pubDate>
      <guid>https://rmoff.net/2025/11/20/ive-been-thinking-about-agents-and-mcp-all-wrong/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Ever tried to hammer a nail in with a potato?&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Nor me, but that’s what I’ve felt like I’ve been attempting to do when trying to really understand agents, as well as to come up with an example agent to build.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;As I wrote about &lt;a href=&#34;https://rmoff.net/2025/10/06/stumbling-into-ai-part-5agents/&#34;&gt;previously&lt;/a&gt;, citing Simon Willison, &lt;em&gt;&lt;strong&gt;an LLM agent runs tools in a loop to achieve a goal&lt;/strong&gt;&lt;/em&gt;.&#xA;Unlike building ETL/ELT pipelines, these were some new concepts that I was struggling to fit to an even semi-plausible real world example.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;That’s because I was thinking about it all wrong.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>How we built the demo for the Current NOLA Day 2 keynote using Flink and AI</title>
      <link>https://rmoff.net/2025/11/06/how-we-built-the-demo-for-the-current-nola-day-2-keynote-using-flink-and-ai/</link>
      <pubDate>Thu, 06 Nov 2025 14:20:08 +0000</pubDate>
      <guid>https://rmoff.net/2025/11/06/how-we-built-the-demo-for-the-current-nola-day-2-keynote-using-flink-and-ai/</guid>
      <description>&lt;p&gt;At Current 2025 in New Orleans this year we built a demo for the &lt;a href=&#34;https://www.youtube.com/watch?v=q05yqzDcSCI&#34;&gt;Day 2 keynote&lt;/a&gt; that would automagically summarise what was happening in the room, as reported by members of the audience.&#xA;Here&amp;rsquo;s how we did it!&lt;/p&gt;</description>
    </item>
    <item>
      <title>Tech Radar (Nov 2025) - data blips</title>
      <link>https://rmoff.net/2025/11/05/tech-radar-nov-2025-data-blips/</link>
      <pubDate>Wed, 05 Nov 2025 09:53:26 +0000</pubDate>
      <guid>https://rmoff.net/2025/11/05/tech-radar-nov-2025-data-blips/</guid>
      <description>&lt;p&gt;The latest &lt;a href=&#34;https://www.thoughtworks.com/radar&#34;&gt;Thoughtworks TechRadar&lt;/a&gt; is out.&#xA;Here are some of the more data-related &amp;lsquo;blips&amp;rsquo; (as they&amp;rsquo;re called on the radar) that I noticed.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Interesting links - October 2025</title>
      <link>https://rmoff.net/2025/10/31/interesting-links-october-2025/</link>
      <pubDate>Fri, 31 Oct 2025 13:28:23 +0000</pubDate>
      <guid>https://rmoff.net/2025/10/31/interesting-links-october-2025/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;What with Current NOLA 2025 happening this week, and some &lt;em&gt;very&lt;/em&gt; last minute preparations for the demo at the keynote on day 2, this month’s links roundup is pushing it right up to the wire :)&#xA;The demo was pretty cool, and finally I have a good example of how this AI stuff actually fits into a workflow ;)&#xA;I’ll write it up as a blog post (or two, probably)—stay tuned!&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Blog Writing for Developers</title>
      <link>https://rmoff.net/talk/blog-writing-for-developers/</link>
      <pubDate>Wed, 22 Oct 2025 16:00:00 +0000</pubDate>
      <guid>https://rmoff.net/talk/blog-writing-for-developers/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;A presentation about effective blog writing for developers, covering why to blog, what to write about, and how to structure your content.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Stumbling into AI: Part 5—Agents</title>
      <link>https://rmoff.net/2025/10/06/stumbling-into-ai-part-5agents/</link>
      <pubDate>Mon, 06 Oct 2025 12:47:17 +0000</pubDate>
      <guid>https://rmoff.net/2025/10/06/stumbling-into-ai-part-5agents/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;em&gt;A &lt;a href=&#34;https://rmoff.net/categories/stumbling-into-ai&#34;&gt;short series&lt;/a&gt; of notes for myself as I learn more about the AI ecosystem as of Autumn [Fall] 2025.&lt;/em&gt;&#xA;&lt;em&gt;The driver for all this is understanding more about Apache Flink’s &lt;a href=&#34;https://github.com/apache/flink-agents&#34;&gt;&lt;strong&gt;Flink Agents&lt;/strong&gt;&lt;/a&gt; project, and Confluent’s &lt;a href=&#34;https://www.confluent.io/product/streaming-agents/&#34;&gt;&lt;strong&gt;Streaming Agents&lt;/strong&gt;&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I started off &lt;a href=&#34;https://rmoff.net/categories/stumbling-into-ai/&#34;&gt;this series&lt;/a&gt;—somewhat randomly, with hindsight—looking at &lt;a href=&#34;https://rmoff.net/2025/09/04/stumbling-into-ai-part-1mcp/&#34;&gt;Model Context Protocol (&lt;strong&gt;MCP&lt;/strong&gt;)&lt;/a&gt;.&#xA;It’s a helper technology to make things easier to use and provide a richer experience.&#xA;Next I tried to wrap my head around &lt;a href=&#34;https://rmoff.net/2025/09/08/stumbling-into-ai-part-2models/&#34;&gt;&lt;strong&gt;Models&lt;/strong&gt;&lt;/a&gt;—mostly LLMs, but also with an &lt;a href=&#34;https://rmoff.net/2025/09/08/stumbling-into-ai-part-2models/#_addendum_there_are_models_and_then_there_are_models_a_k_a_not_all_models_are_llms&#34;&gt;addendum&lt;/a&gt; discussing other types of model too.&#xA;Along the lines of MCP, &lt;a href=&#34;https://rmoff.net/2025/09/12/stumbling-into-ai-part-3rag/&#34;&gt;Retrieval Augmented Generation (&lt;strong&gt;RAG&lt;/strong&gt;)&lt;/a&gt; is another helper technology that on its own doesn’t do anything but combined with an LLM gives it added smarts.&#xA;I took a brief moment in part 4 to try and build a clearer understanding of &lt;a href=&#34;https://rmoff.net/2025/09/16/stumbling-into-ai-part-4terminology-tidy-up-and-a-little-rant/&#34;&gt;&lt;strong&gt;the difference between ML and AI&lt;/strong&gt;&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;So whilst RAG and MCP combined make for a bunch of nice capabilities beyond models such as LLMs alone, what I’m really circling around here is what we can do when we combine all these things: &lt;strong&gt;Agents&lt;/strong&gt;!&#xA;But…what &lt;em&gt;is&lt;/em&gt; an Agent, both conceptually and in practice?&#xA;Let’s try and figure it out.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Interesting links - September 2025</title>
      <link>https://rmoff.net/2025/09/30/interesting-links-september-2025/</link>
      <pubDate>Tue, 30 Sep 2025 07:04:31 +0000</pubDate>
      <guid>https://rmoff.net/2025/09/30/interesting-links-september-2025/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Sneaking it in &lt;em&gt;just&lt;/em&gt; before the end of the month!&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;It’s a bumper set of links this month—I started with an original backlog of 125 links to get through.&#xA;Some fell by the wayside, but plenty of others (78, to be precise) made the cut.&#xA;With no further ado, let’s get cracking!&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Stumbling into AI: Part 4—Terminology Tidy-up (and a little rant)</title>
      <link>https://rmoff.net/2025/09/16/stumbling-into-ai-part-4terminology-tidy-up-and-a-little-rant/</link>
      <pubDate>Tue, 16 Sep 2025 13:38:43 +0000</pubDate>
      <guid>https://rmoff.net/2025/09/16/stumbling-into-ai-part-4terminology-tidy-up-and-a-little-rant/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Having looked at &lt;a href=&#34;https://rmoff.net/2025/09/04/stumbling-into-ai-part-1mcp/&#34;&gt;MCP&lt;/a&gt;, &lt;a href=&#34;https://rmoff.net/2025/09/08/stumbling-into-ai-part-2models/&#34;&gt;Models&lt;/a&gt;, and &lt;a href=&#34;https://rmoff.net/2025/09/12/stumbling-into-ai-part-3rag/&#34;&gt;RAG&lt;/a&gt;, I realised that I’ve been mentally skirting around something that I don’t really understand, so I’m going to expose myself to some ridicule here and try to understand better: what’s the difference between AI and ML? Aren’t they just the same?&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Stumbling into AI: Part 3—RAG</title>
      <link>https://rmoff.net/2025/09/12/stumbling-into-ai-part-3rag/</link>
      <pubDate>Fri, 12 Sep 2025 13:10:34 +0000</pubDate>
      <guid>https://rmoff.net/2025/09/12/stumbling-into-ai-part-3rag/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;em&gt;A &lt;a href=&#34;https://rmoff.net/categories/stumbling-into-ai&#34;&gt;short series&lt;/a&gt; of notes for myself as I learn more about the AI ecosystem as of September 2025.&lt;/em&gt;&#xA;&lt;em&gt;The driver for all this is understanding more about Apache Flink’s &lt;a href=&#34;https://github.com/apache/flink-agents&#34;&gt;&lt;strong&gt;Flink Agents&lt;/strong&gt;&lt;/a&gt; project, and Confluent’s &lt;a href=&#34;https://www.confluent.io/product/streaming-agents/&#34;&gt;&lt;strong&gt;Streaming Agents&lt;/strong&gt;&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Having &lt;a href=&#34;https://rmoff.net/2025/09/04/stumbling-into-ai-part-1mcp/&#34;&gt;poked around MCP&lt;/a&gt; and &lt;a href=&#34;https://rmoff.net/2025/09/08/stumbling-into-ai-part-2models/&#34;&gt;Models&lt;/a&gt;, next up is RAG.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;RAG has been one of the buzzwords of the last couple of years, with any vendor worth its salt finding a way to crowbar it into their product.&#xA;I’d been sufficiently put off it by the hype to steer away from actually understanding what it is.&#xA;In this blog post, let’s fix that—because if I’ve understood it correctly, it’s a pattern that’s not scary at all.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Stumbling into AI: Part 2—Models</title>
      <link>https://rmoff.net/2025/09/08/stumbling-into-ai-part-2models/</link>
      <pubDate>Mon, 08 Sep 2025 08:10:34 +0000</pubDate>
      <guid>https://rmoff.net/2025/09/08/stumbling-into-ai-part-2models/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;em&gt;A &lt;a href=&#34;https://rmoff.net/categories/stumbling-into-ai&#34;&gt;short series&lt;/a&gt; of notes for myself as I learn more about the AI ecosystem as of September 2025.&lt;/em&gt;&#xA;&lt;em&gt;The driver for all this is understanding more about Apache Flink’s &lt;a href=&#34;https://github.com/apache/flink-agents&#34;&gt;&lt;strong&gt;Flink Agents&lt;/strong&gt;&lt;/a&gt; project, and Confluent’s &lt;a href=&#34;https://www.confluent.io/product/streaming-agents/&#34;&gt;&lt;strong&gt;Streaming Agents&lt;/strong&gt;&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Having &lt;a href=&#34;https://rmoff.net/2025/09/04/stumbling-into-ai-part-1mcp/&#34;&gt;poked around MCP&lt;/a&gt; and got a broad idea of what it is, I want to next look at Models.&#xA;What used to be as simple as &amp;#34;&lt;em&gt;I used AI&lt;/em&gt;&amp;#34; actually boils down into several discrete areas, particularly when one starts looking at using LLMs beyond writing &lt;a href=&#34;https://rmoff.net/images/2025/09/13d0418e1ddd2f60eef260aa512cb2a27aed080a4702fd7f01e73ef7b8ba5c2b.webp&#34;&gt;a rap about Apache Kafka in the style of Monty Python&lt;/a&gt; and using it to build agents (like the Flink Agents that prompted this exploration in the first place).&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Stumbling into AI: Part 1—MCP</title>
      <link>https://rmoff.net/2025/09/04/stumbling-into-ai-part-1mcp/</link>
      <pubDate>Thu, 04 Sep 2025 09:10:34 +0000</pubDate>
      <guid>https://rmoff.net/2025/09/04/stumbling-into-ai-part-1mcp/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;em&gt;A &lt;a href=&#34;https://rmoff.net/categories/stumbling-into-ai&#34;&gt;short series&lt;/a&gt; of notes for myself as I learn more about the AI ecosystem as of September 2025.&lt;/em&gt;&#xA;&lt;em&gt;The driver for all this is understanding more about Apache Flink’s &lt;a href=&#34;https://github.com/apache/flink-agents&#34;&gt;&lt;strong&gt;Flink Agents&lt;/strong&gt;&lt;/a&gt; project, and Confluent’s &lt;a href=&#34;https://www.confluent.io/product/streaming-agents/&#34;&gt;&lt;strong&gt;Streaming Agents&lt;/strong&gt;&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;The first thing I want to understand better is MCP.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Interesting links - August 2025</title>
      <link>https://rmoff.net/2025/08/21/interesting-links-august-2025/</link>
      <pubDate>Thu, 21 Aug 2025 16:33:14 +0000</pubDate>
      <guid>https://rmoff.net/2025/08/21/interesting-links-august-2025/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;You can find previous editions of &lt;em&gt;Interesting Links&lt;/em&gt; &lt;a href=&#34;https://rmoff.net/categories/interesting-links/&#34;&gt;here&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Kafka to Iceberg - Exploring the Options</title>
      <link>https://rmoff.net/2025/08/18/kafka-to-iceberg-exploring-the-options/</link>
      <pubDate>Mon, 18 Aug 2025 13:43:31 +0000</pubDate>
      <guid>https://rmoff.net/2025/08/18/kafka-to-iceberg-exploring-the-options/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;You’ve got data in &lt;a href=&#34;https://www.youtube.com/watch?v=9CrlA0Wasvk&#34;&gt;Apache Kafka&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;You want to get that data into &lt;a href=&#34;https://www.youtube.com/watch?v=TsmhRZElPvM&#34;&gt;Apache Iceberg&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;What’s the best way to do it?&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;imageblock&#34;&gt;&#xA;&lt;div class=&#34;content&#34;&gt;&#xA;&lt;img src=&#34;https://rmoff.net/images/2025/08/kafka-to-iceberg.png&#34; alt=&#34;kafka to iceberg&#34;/&gt;&#xA;&lt;/div&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Perhaps invariably, the answer is: &lt;strong&gt;IT DEPENDS&lt;/strong&gt;.&#xA;But fear not: here is a guide to help you navigate your way to choosing the best solution &lt;em&gt;for you&lt;/em&gt; 🫵.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Connecting Apache Flink SQL to Confluent Cloud Kafka broker</title>
      <link>https://rmoff.net/2025/07/22/connecting-apache-flink-sql-to-confluent-cloud-kafka-broker/</link>
      <pubDate>Tue, 22 Jul 2025 11:27:46 +0000</pubDate>
      <guid>https://rmoff.net/2025/07/22/connecting-apache-flink-sql-to-confluent-cloud-kafka-broker/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;This is a quick blog post to remind me how to connect Apache Flink to a Kafka topic on Confluent Cloud.&#xA;You may wonder &lt;strong&gt;why&lt;/strong&gt; you’d want to do this, given that &lt;a href=&#34;https://www.confluent.io/en-gb/product/flink/&#34;&gt;&lt;strong&gt;Confluent Cloud for Apache Flink&lt;/strong&gt;&lt;/a&gt; is a much easier way to run Flink SQL.&#xA;But, for whatever reason, you’re here and you want to understand the necessary incantations to get this connectivity to work.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Interesting links - July 2025</title>
      <link>https://rmoff.net/2025/07/18/interesting-links-july-2025/</link>
      <pubDate>Fri, 18 Jul 2025 14:38:04 +0000</pubDate>
      <guid>https://rmoff.net/2025/07/18/interesting-links-july-2025/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;First up, allow me a shameless plug for my blog posts this month:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;ulist&#34;&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://rmoff.net/2025/07/04/writing-to-apache-iceberg-on-s3-using-kafka-connect-with-glue-catalog/&#34;&gt;Writing to &lt;strong&gt;Apache Iceberg on S3 using Kafka Connect&lt;/strong&gt; with Glue catalog&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://rmoff.net/2025/07/14/keeping-your-data-lakehouse-in-order-table-maintenance-in-apache-iceberg/&#34;&gt;Keeping your Data Lakehouse in Order: &lt;strong&gt;Table Maintenance in Apache Iceberg&lt;/strong&gt;&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;🔥 &lt;a href=&#34;https://www.confluent.io/blog/streaming-etl-flink-tableflow/&#34;&gt;Building Streaming Data Pipelines, Part 2: &lt;strong&gt;Data Processing and Enrichment with Flink SQL&lt;/strong&gt;&lt;/a&gt; (see also &lt;a href=&#34;https://www.confluent.io/blog/building-streaming-data-pipelines-part-1/&#34;&gt;Part 1&lt;/a&gt;)&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Keeping your Data Lakehouse in Order: Table Maintenance in Apache Iceberg</title>
      <link>https://rmoff.net/2025/07/14/keeping-your-data-lakehouse-in-order-table-maintenance-in-apache-iceberg/</link>
      <pubDate>Mon, 14 Jul 2025 14:43:04 +0000</pubDate>
      <guid>https://rmoff.net/2025/07/14/keeping-your-data-lakehouse-in-order-table-maintenance-in-apache-iceberg/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Iceberg nicely decouples storage from ingest and query (yay!).&#xA;When we say &amp;#34;decouples&amp;#34; it’s a fancy way of saying &amp;#34;doesn’t do&amp;#34;.&#xA;Which, in the case of ingest and query, is really powerful.&#xA;It means that we can store data in an open format, populated by one or more tools, and queried by the same, or other tools.&#xA;Iceberg gets to be very opinionated and optimised around what it was built for (storing tabular data in a flexible way that can be efficiently queried).&#xA;This is amazing!&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;But, what Iceberg doesn’t do is any housekeeping on its data and metadata.&#xA;This means that getting data in and out of Apache Iceberg isn’t where the story stops.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Writing to Apache Iceberg on S3 using Kafka Connect with Glue catalog</title>
      <link>https://rmoff.net/2025/07/04/writing-to-apache-iceberg-on-s3-using-kafka-connect-with-glue-catalog/</link>
      <pubDate>Fri, 04 Jul 2025 15:36:21 +0000</pubDate>
      <guid>https://rmoff.net/2025/07/04/writing-to-apache-iceberg-on-s3-using-kafka-connect-with-glue-catalog/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Without wanting to mix my temperature metaphors, Iceberg is the new hawtness, and getting data into it from other places is a common task.&#xA;I &lt;a href=&#34;https://rmoff.net/2025/06/24/writing-to-apache-iceberg-on-s3-using-flink-sql-with-glue-catalog/&#34;&gt;wrote previously about using Flink SQL to do this&lt;/a&gt;, and today I’m going to look at doing the same using Kafka Connect.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Kafka Connect can send data to Iceberg from any Kafka topic.&#xA;The source Kafka topic(s) can be populated by a Kafka Connect source connector (such as Debezium), or a regular application producing directly to it.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Interesting links - June 2025</title>
      <link>https://rmoff.net/2025/06/27/interesting-links-june-2025/</link>
      <pubDate>Fri, 27 Jun 2025 12:42:59 +0000</pubDate>
      <guid>https://rmoff.net/2025/06/27/interesting-links-june-2025/</guid>
      <description></description>
    </item>
    <item>
      <title>Writing to Apache Iceberg on S3 using Flink SQL with Glue catalog</title>
      <link>https://rmoff.net/2025/06/24/writing-to-apache-iceberg-on-s3-using-flink-sql-with-glue-catalog/</link>
      <pubDate>Tue, 24 Jun 2025 17:12:50 +0000</pubDate>
      <guid>https://rmoff.net/2025/06/24/writing-to-apache-iceberg-on-s3-using-flink-sql-with-glue-catalog/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;In this blog post I’ll show how you can use Flink SQL to write to Iceberg on S3, storing metadata about the Iceberg tables in the &lt;a href=&#34;https://docs.aws.amazon.com/glue/latest/dg/components-overview.html#data-catalog-intro&#34;&gt;AWS Glue Data Catalog&lt;/a&gt;.&#xA;First off, I’ll walk through the dependencies and a simple smoke-test, and then put it into practice using it to write data from a Kafka topic to Iceberg.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Digging into Ducklake</title>
      <link>https://rmoff.net/2025/06/02/digging-into-ducklake/</link>
      <pubDate>Mon, 02 Jun 2025 14:26:15 +0000</pubDate>
      <guid>https://rmoff.net/2025/06/02/digging-into-ducklake/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;After a week’s holiday (&amp;#34;vacation&amp;#34;, for y’all in the US) without a glance at anything work-related, what joy to return and find that the DuckDB folk have been busy, not only with &lt;a href=&#34;https://duckdb.org/2025/05/21/announcing-duckdb-130.html&#34;&gt;the recent 1.3.0 DuckDB release&lt;/a&gt;, but also a brand new project called &lt;a href=&#34;https://github.com/duckdb/ducklake&#34;&gt;DuckLake&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Here are my brief notes on DuckLake.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Interesting links - May 2025</title>
      <link>https://rmoff.net/2025/05/23/interesting-links-may-2025/</link>
      <pubDate>Fri, 23 May 2025 10:02:42 +0000</pubDate>
      <guid>https://rmoff.net/2025/05/23/interesting-links-may-2025/</guid>
      <description></description>
    </item>
    <item>
      <title>Exploring Joins and Changelogs in Flink SQL</title>
      <link>https://rmoff.net/2025/05/20/exploring-joins-and-changelogs-in-flink-sql/</link>
      <pubDate>Tue, 20 May 2025 12:30:36 +0000</pubDate>
      <guid>https://rmoff.net/2025/05/20/exploring-joins-and-changelogs-in-flink-sql/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;strong&gt;SQL&lt;/strong&gt;. Three simple letters.&#xA;&lt;em&gt;Ess Queue Ell&lt;/em&gt;.&#xA;&lt;code&gt;/ˌɛs kjuː ˈɛl/&lt;/code&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;In the data world they bind us together, yet separate us.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;As the saying goes, England and America are two countries divided by the same language, and the same goes for the batch and streaming world and some elements of SQL.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>🏃🚶 The unofficial Current London 2025 Run/Walk 🏃🚶</title>
      <link>https://rmoff.net/2025/05/02/the-unofficial-current-london-2025-run/walk/</link>
      <pubDate>Fri, 02 May 2025 14:44:52 +0000</pubDate>
      <guid>https://rmoff.net/2025/05/02/the-unofficial-current-london-2025-run/walk/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;strong&gt;Another year, another Current—another 5k run/walk for anyone who’d like to join!&lt;/strong&gt;&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>It&#39;s Time We Talked About Time: Exploring Watermarks (And More) In Flink SQL</title>
      <link>https://rmoff.net/2025/04/25/its-time-we-talked-about-time-exploring-watermarks-and-more-in-flink-sql/</link>
      <pubDate>Fri, 25 Apr 2025 15:26:56 +0000</pubDate>
      <guid>https://rmoff.net/2025/04/25/its-time-we-talked-about-time-exploring-watermarks-and-more-in-flink-sql/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Whether you’re processing data in batch or as a stream, the concept of time is an important part of accurate processing logic.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Because we process data after it happens, there are a minimum of two different types of time to consider:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;olist arabic&#34;&gt;&#xA;&lt;ol class=&#34;arabic&#34;&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;&lt;strong&gt;When it happened&lt;/strong&gt;, known as &lt;strong&gt;Event Time&lt;/strong&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;&lt;strong&gt;When we process it&lt;/strong&gt;, known as &lt;strong&gt;Processing Time&lt;/strong&gt; (or &lt;em&gt;system time&lt;/em&gt; or &lt;em&gt;wall clock time&lt;/em&gt;)&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ol&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Interesting links - April 2025</title>
      <link>https://rmoff.net/2025/04/22/interesting-links-april-2025/</link>
      <pubDate>Tue, 22 Apr 2025 10:02:42 +0000</pubDate>
      <guid>https://rmoff.net/2025/04/22/interesting-links-april-2025/</guid>
      <description></description>
    </item>
    <item>
      <title>Confluent Cloud for Apache Flink - Exploring the API</title>
      <link>https://rmoff.net/2025/03/25/confluent-cloud-for-apache-flink-exploring-the-api/</link>
      <pubDate>Tue, 25 Mar 2025 15:26:13 +0000</pubDate>
      <guid>https://rmoff.net/2025/03/25/confluent-cloud-for-apache-flink-exploring-the-api/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://www.confluent.io/en-gb/blog/serverless-flink-confluent-cloud-generally-available/&#34;&gt;Confluent Cloud for Apache Flink&lt;/a&gt; gives you access to run Flink workloads using a serverless platform on Confluent Cloud.&#xA;After &lt;a href=&#34;https://rmoff.net/2025/03/13/creating-an-http-source-connector-on-confluent-cloud-from-the-cli/&#34;&gt;poking around the Confluent Cloud API for configuring connectors&lt;/a&gt; I wanted to take a look at the same for Flink.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Using the API is useful particularly if you want to script a deployment, or automate a bulk operation that might be tiresome to do otherwise.&#xA;It’s also handy if you just prefer living in the CLI :)&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Interesting links - March 2025</title>
      <link>https://rmoff.net/2025/03/24/interesting-links-march-2025/</link>
      <pubDate>Mon, 24 Mar 2025 10:01:56 +0000</pubDate>
      <guid>https://rmoff.net/2025/03/24/interesting-links-march-2025/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;em&gt;The problem with publishing &lt;a href=&#34;https://rmoff.net/2025/02/03/interesting-links-february-2025/&#34;&gt;February’s interesting links&lt;/a&gt; at the beginning of the month and now getting around to publishing March’s at the end is that I have nearly two months&amp;#39; worth of links to share 😅 So with no further ado, let’s crack on.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>How to create Carousel posts in LinkedIn…without the bullshit</title>
      <link>https://rmoff.net/2025/03/21/how-to-create-carousel-posts-in-linkedinwithout-the-bullshit/</link>
      <pubDate>Fri, 21 Mar 2025 17:42:57 +0000</pubDate>
      <guid>https://rmoff.net/2025/03/21/how-to-create-carousel-posts-in-linkedinwithout-the-bullshit/</guid>
      <description>&lt;div class=&#34;admonitionblock tip&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-tip&#34; title=&#34;Tip&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;tl;dr: Upload a PDF document in which each slide of the carousel is one page.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;hr/&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I wanted to post a Carousel post in LinkedIn, but had to wade through a million pages of crap in Google from companies trying to sell shit.&#xA;Here’s how to do it simply.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Building a data pipeline with DuckDB</title>
      <link>https://rmoff.net/2025/03/20/building-a-data-pipeline-with-duckdb/</link>
      <pubDate>Thu, 20 Mar 2025 10:01:56 +0000</pubDate>
      <guid>https://rmoff.net/2025/03/20/building-a-data-pipeline-with-duckdb/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;In this blog post I’m going to explore how as a data engineer in the field today I might go about putting together a rudimentary data pipeline.&#xA;I’ll take some operational data, and wrangle it into a form that makes it easily pliable for analytics work.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;After a somewhat fevered and nightmarish period during which people walked around declaring &amp;#34;Schema on Read&amp;#34; was the future, that &amp;#34;Data is the new oil&amp;#34;, and &amp;#34;Look at the size of my big data&amp;#34;, the path that is history in IT is somewhat coming back on itself to a more sensible approach to things.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;As they say:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;quoteblock&#34;&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;What’s old is new&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;This is good news for me, because I am old and what I knew then is &amp;#39;new&amp;#39; now ;)&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Exporting Notebooks from DuckDB UI</title>
      <link>https://rmoff.net/2025/03/19/exporting-notebooks-from-duckdb-ui/</link>
      <pubDate>Wed, 19 Mar 2025 17:01:56 +0000</pubDate>
      <guid>https://rmoff.net/2025/03/19/exporting-notebooks-from-duckdb-ui/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;DuckDB added a very cool &lt;a href=&#34;https://duckdb.org/2025/03/12/duckdb-ui.html&#34;&gt;UI&lt;/a&gt; last week and &lt;a href=&#34;https://rmoff.net/2025/03/14/kicking-the-tyres-on-the-new-duckdb-ui/&#34;&gt;I’ve been using it&lt;/a&gt; as my primary interface to DuckDB since.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;One thing that bothered me was that the SQL I was writing in the notebooks wasn’t exportable.&#xA;Since the DuckDB UI uses DuckDB in the background for storing notebooks, getting the SQL out is easy enough.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Kicking the tyres on the new DuckDB UI</title>
      <link>https://rmoff.net/2025/03/14/kicking-the-tyres-on-the-new-duckdb-ui/</link>
      <pubDate>Fri, 14 Mar 2025 12:26:16 +0000</pubDate>
      <guid>https://rmoff.net/2025/03/14/kicking-the-tyres-on-the-new-duckdb-ui/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I wrote a couple of weeks ago about &lt;a href=&#34;https://rmoff.net/2025/02/28/exploring-uk-environment-agency-data-in-duckdb-and-rill/&#34;&gt;using DuckDB and Rill Data&lt;/a&gt; to explore a new data source that I’m working with.&#xA;I wanted to understand the data’s structure and distribution of values, as well as how different entities related.&#xA;This week DuckDB 1.2.1 was released and that little 0.0.1 version boost brought with it the &lt;a href=&#34;https://duckdb.org/2025/03/12/duckdb-ui.html&#34;&gt;DuckDB UI&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Here I’ll go through the same process as I did before, and see how much of what I was doing can be done in DuckDB alone now.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Creating an HTTP Source connector on Confluent Cloud from the CLI</title>
      <link>https://rmoff.net/2025/03/13/creating-an-http-source-connector-on-confluent-cloud-from-the-cli/</link>
      <pubDate>Thu, 13 Mar 2025 11:29:40 +0000</pubDate>
      <guid>https://rmoff.net/2025/03/13/creating-an-http-source-connector-on-confluent-cloud-from-the-cli/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;In this blog article I’ll show you how you can use the &lt;a href=&#34;https://docs.confluent.io/confluent-cli/current/overview.html&#34;&gt;&lt;code&gt;confluent&lt;/code&gt; CLI&lt;/a&gt; to set up a Kafka cluster on Confluent Cloud, the necessary API keys, and then a managed connector.&#xA;The connector I’m setting up is the &lt;a href=&#34;https://docs.confluent.io/cloud/current/connectors/cc-http-source-v2.html&#34;&gt;HTTP Source (v2)&lt;/a&gt; connector.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;imageblock&#34;&gt;&#xA;&lt;div class=&#34;content&#34;&gt;&#xA;&lt;img src=&#34;https://rmoff.net/images/2025/03/managed-http-connector.webp&#34; alt=&#34;managed http connector&#34;/&gt;&#xA;&lt;/div&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;It’s part of a pipeline that I’m working on to pull in &lt;a href=&#34;https://environment.data.gov.uk/flood-monitoring/doc/reference&#34;&gt;a feed of data from the UK Environment Agency&lt;/a&gt; for processing.&#xA;The data is spread across three endpoints, and one of the nice features of the HTTP Source (v2) connector is that one connector can pull data from more than one endpoint.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Why is kcat showing the wrong topics?</title>
      <link>https://rmoff.net/2025/03/13/why-is-kcat-showing-the-wrong-topics/</link>
      <pubDate>Thu, 13 Mar 2025 10:42:11 +0000</pubDate>
      <guid>https://rmoff.net/2025/03/13/why-is-kcat-showing-the-wrong-topics/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Much as I love kcat (🤫 &lt;em&gt;it’ll always be kafkacat to me&lt;/em&gt;…), this morning I nearly fell out with it 👇&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;😖 I thought I was going stir crazy, after listing topics on a broker &lt;strong&gt;and seeing topics from a different broker&lt;/strong&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;😵 WTF 😵&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Write more blog articles, not fewer (Don&#39;t leave the scraps on the cutting floor)</title>
      <link>https://rmoff.net/2025/03/11/write-more-blog-articles-not-fewer-dont-leave-the-scraps-on-the-cutting-floor/</link>
      <pubDate>Tue, 11 Mar 2025 10:41:44 +0000</pubDate>
      <guid>https://rmoff.net/2025/03/11/write-more-blog-articles-not-fewer-dont-leave-the-scraps-on-the-cutting-floor/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Some would say that the perfect blog article takes the reader on a journey in which the development process looks like this:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;imageblock&#34;&gt;&#xA;&lt;div class=&#34;content&#34;&gt;&#xA;&lt;img src=&#34;https://rmoff.net/images/2025/03/blog_content1.excalidraw.webp&#34; alt=&#34;blog content1.excalidraw&#34;/&gt;&#xA;&lt;/div&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Data Wrangling with Flink SQL</title>
      <link>https://rmoff.net/2025/03/10/data-wrangling-with-flink-sql/</link>
      <pubDate>Mon, 10 Mar 2025 16:57:44 +0000</pubDate>
      <guid>https://rmoff.net/2025/03/10/data-wrangling-with-flink-sql/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;The UK Government publishes a lot of its data as &lt;a href=&#34;https://www.data.gov.uk/&#34;&gt;open feeds&lt;/a&gt;.&#xA;One that I keep coming back to is the &lt;a href=&#34;https://environment.data.gov.uk/flood-monitoring/doc/reference&#34;&gt;Environment Agency’s flood-monitoring API&lt;/a&gt; that gives access to an estate of sensors that provide information about data such as river levels and rainfall.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;The data is well-structured and provided across three primary API endpoints.&#xA;In this blog article I’m going to show you how I use Flink SQL to explore and wrangle these into the kind of form from which I am then going to build a streaming pipeline using them.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Joining two streams of data with Flink SQL</title>
      <link>https://rmoff.net/2025/03/06/joining-two-streams-of-data-with-flink-sql/</link>
      <pubDate>Thu, 06 Mar 2025 15:45:41 +0000</pubDate>
      <guid>https://rmoff.net/2025/03/06/joining-two-streams-of-data-with-flink-sql/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;There was a useful question on the &lt;a href=&#34;https://flink.apache.org/what-is-flink/community/#slack&#34;&gt;Apache Flink Slack&lt;/a&gt; recently about joining data in Flink SQL:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;quoteblock&#34;&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;How can I join two streams of data by id in Flink, to get a combined view of the latest data?&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>How to explode nested arrays with Flink SQL</title>
      <link>https://rmoff.net/2025/03/03/how-to-explode-nested-arrays-with-flink-sql/</link>
      <pubDate>Mon, 03 Mar 2025 14:41:44 +0000</pubDate>
      <guid>https://rmoff.net/2025/03/03/how-to-explode-nested-arrays-with-flink-sql/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Let’s imagine we’ve got a source of data with a nested array of multiple values.&#xA;The data is from an IoT device.&#xA;Each device has multiple sensors, each sensor provides a reading.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Exploring UK Environment Agency data in DuckDB and Rill</title>
      <link>https://rmoff.net/2025/02/28/exploring-uk-environment-agency-data-in-duckdb-and-rill/</link>
      <pubDate>Fri, 28 Feb 2025 10:02:33 +0000</pubDate>
      <guid>https://rmoff.net/2025/02/28/exploring-uk-environment-agency-data-in-duckdb-and-rill/</guid>
      <description>&lt;p&gt;The UK Environment Agency publishes &lt;a href=&#34;https://environment.data.gov.uk/flood-monitoring/doc/reference#api-summary&#34;&gt;a feed of data relating to rainfall and river levels&lt;/a&gt;. As a prelude to building a streaming pipeline with this data, I wanted to understand the model of it first.&lt;/p&gt;</description>
    </item>
    <item>
      <title>DuckDB tricks - renaming fields in a SELECT * across tables</title>
      <link>https://rmoff.net/2025/02/27/duckdb-tricks-renaming-fields-in-a-select-across-tables/</link>
      <pubDate>Thu, 27 Feb 2025 09:43:20 +0000</pubDate>
      <guid>https://rmoff.net/2025/02/27/duckdb-tricks-renaming-fields-in-a-select-across-tables/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I was exploring some new data, joining across multiple tables, and doing a simple &lt;code&gt;SELECT *&lt;/code&gt; as I’d not worked out yet which columns I actually wanted.&#xA;The issue was that the same field name existed in more than one table.&#xA;This meant that in the results from the query, it wasn’t clear which field came from which table:&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Interesting links - February 2025</title>
      <link>https://rmoff.net/2025/02/03/interesting-links-february-2025/</link>
      <pubDate>Mon, 03 Feb 2025 20:49:01 +0000</pubDate>
      <guid>https://rmoff.net/2025/02/03/interesting-links-february-2025/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Here’s a bunch of interesting links and articles about data that I’ve come across recently.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Checkpoint Chronicle - December 2024</title>
      <link>https://rmoff.net/2024/12/19/checkpoint-chronicle-december-2024/</link>
      <pubDate>Thu, 19 Dec 2024 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2024/12/19/checkpoint-chronicle-december-2024/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;This post originally appeared on the &lt;a href=&#34;https://www.decodable.co/blog/checkpoint-chronicle-december-2024&#34;&gt;Decodable blog&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Welcome to the &lt;em&gt;Checkpoint Chronicle&lt;/em&gt;, a monthly roundup of interesting stuff in the data and streaming space.&#xA;Your hosts and esteemed curators of said content are  &lt;a href=&#34;https://twitter.com/gunnarmorling?lang=en&#34;&gt;Gunnar Morling&lt;/a&gt; ,  &lt;a href=&#34;https://twitter.com/rmoff/&#34;&gt;Robin Moffatt&lt;/a&gt;  (your editor-in-chief for this edition), and  &lt;a href=&#34;https://twitter.com/hpgrahsl&#34;&gt;Hans-Peter Grahsl&lt;/a&gt; .&#xA;Feel free to send our way any choice nuggets that you think we should feature in future editions.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Disabling Vale Linting Selectively in Asciidoc</title>
      <link>https://rmoff.net/2024/12/11/disabling-vale-linting-selectively-in-asciidoc/</link>
      <pubDate>Wed, 11 Dec 2024 12:05:07 +0000</pubDate>
      <guid>https://rmoff.net/2024/12/11/disabling-vale-linting-selectively-in-asciidoc/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I’m a &lt;strong&gt;HUGE&lt;/strong&gt; fan of Docs as Code in general, and specifically tools like &lt;a href=&#34;https://vale.sh&#34;&gt;Vale&lt;/a&gt; that lint your prose for adherence to style rules.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;One thing that had been bugging me though was how to selectively disable Vale for particular sections of a document.&#xA;Usually linting issues should be addressed at root: either fix the prose, or update the style rule. Either it’s a rule, or it’s not, right?&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Sometimes though I’ve found a need to make a particular exception to a rule, or simply needed to skip linting for a particular file.&#xA;I was struggling with how to do this in Asciidoc.&#xA;Despite &lt;a href=&#34;https://vale.sh/docs/topics/config/#asciidoc&#34;&gt;the documentation&lt;/a&gt; showing how to do this, I could never get it to work reliably.&#xA;Now I’ve taken some time to dig into it, I think I’ve finally understood :)&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Exploring Flink CDC</title>
      <link>https://rmoff.net/2024/12/11/exploring-flink-cdc/</link>
      <pubDate>Wed, 11 Dec 2024 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2024/12/11/exploring-flink-cdc/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;This post originally appeared on the &lt;a href=&#34;https://www.decodable.co/blog/exploring-flink-cdc&#34;&gt;Decodable blog&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Flink CDC is an interesting part of Apache Flink that I’ve been meaning to take a proper look at for some time now.&#xA;Originally created by Ververica in 2021 and called “CDC Connectors for Apache Flink”, it was  &lt;a href=&#34;https://www.ververica.com/blog/ververica-donates-flink-cdc-empowering-real-time-data-integration-for-the-community&#34;&gt;donated&lt;/a&gt;  to live under the Apache Flink project in April 2024.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Streaming Data from Postgres to Snowflake with CDC and Decodable</title>
      <link>https://rmoff.net/2024/11/19/streaming-data-from-postgres-to-snowflake-with-cdc-and-decodable/</link>
      <pubDate>Tue, 19 Nov 2024 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2024/11/19/streaming-data-from-postgres-to-snowflake-with-cdc-and-decodable/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;This post originally appeared on the &lt;a href=&#34;https://www.decodable.co/blog/streaming-data-from-postgres-to-snowflake&#34;&gt;Decodable blog&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;In my  &lt;a href=&#34;https://rmoff.net/2024/10/15/why-do-i-need-cdc/&#34;&gt;last blog post&lt;/a&gt;  I looked at why you might need CDC.&#xA;In this post I’m going to put it into practice with probably the most common use case—extracting data from an operational transactional database to store somewhere else for analytics.&#xA;I’m going to show Postgres to Snowflake, but the pattern is the same for pretty much any combination, such as MySQL to BigQuery, SQL Server to Redshift, and so on.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Checkpoint Chronicle - October 2024</title>
      <link>https://rmoff.net/2024/10/30/checkpoint-chronicle-october-2024/</link>
      <pubDate>Wed, 30 Oct 2024 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2024/10/30/checkpoint-chronicle-october-2024/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;This post originally appeared on the &lt;a href=&#34;https://www.decodable.co/blog/checkpoint-chronicle-october-2024&#34;&gt;Decodable blog&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Welcome to the &lt;em&gt;Checkpoint Chronicle&lt;/em&gt;, a monthly roundup of interesting stuff in the data and streaming space.&#xA;Your hosts and esteemed curators of said content are  &lt;a href=&#34;https://twitter.com/gunnarmorling?lang=en&#34;&gt;Gunnar Morling&lt;/a&gt; ,  &lt;a href=&#34;https://twitter.com/rmoff/&#34;&gt;Robin Moffatt&lt;/a&gt;  (your editor-in-chief for this edition), and  &lt;a href=&#34;https://twitter.com/hpgrahsl&#34;&gt;Hans-Peter Grahsl&lt;/a&gt; .&#xA;Feel free to send our way any choice nuggets that you think we should feature in future editions.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Why Do I Need CDC?</title>
      <link>https://rmoff.net/2024/10/15/why-do-i-need-cdc/</link>
      <pubDate>Tue, 15 Oct 2024 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2024/10/15/why-do-i-need-cdc/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;This post originally appeared on the &lt;a href=&#34;https://www.decodable.co/blog/why-do-i-need-cdc&#34;&gt;Decodable blog&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Whether it’s &lt;em&gt;Understanding CDC&lt;/em&gt; or &lt;em&gt;CDC Explained&lt;/em&gt; or even &lt;em&gt;Five things about CDC; number four will shock you!&lt;/em&gt;, the internet is awash with articles about  &lt;a href=&#34;https://en.wikipedia.org/wiki/Change_data_capture&#34;&gt;Change Data Capture (CDC)&lt;/a&gt; .&#xA;CDC is the process of incrementally extracting data change events as they occur within a database.&#xA;In this post I’d like to take a step back from these and look at the reasons why you might even want to consider CDC in the first place.&#xA;From here we’ll then build on the requirements we identify to come up with a proposed solution.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Checkpoint Chronicle - September 2024</title>
      <link>https://rmoff.net/2024/09/26/checkpoint-chronicle-september-2024/</link>
      <pubDate>Thu, 26 Sep 2024 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2024/09/26/checkpoint-chronicle-september-2024/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;This post originally appeared on the &lt;a href=&#34;https://www.decodable.co/blog/checkpoint-chronicle-september-2024&#34;&gt;Decodable blog&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Welcome to the &lt;em&gt;Checkpoint Chronicle&lt;/em&gt;, a monthly roundup of interesting stuff in the data and streaming space.&#xA;Your hosts and esteemed curators of said content are  &lt;a href=&#34;https://twitter.com/gunnarmorling?lang=en&#34;&gt;Gunnar Morling&lt;/a&gt; ,  &lt;a href=&#34;https://twitter.com/rmoff/&#34;&gt;Robin Moffatt&lt;/a&gt;  (your editor-in-chief for this edition), and  &lt;a href=&#34;https://twitter.com/hpgrahsl&#34;&gt;Hans-Peter Grahsl&lt;/a&gt; .&#xA;Feel free to send our way any choice nuggets that you think we should feature in future editions.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Current 2024 Recap</title>
      <link>https://rmoff.net/2024/09/19/current-2024-recap/</link>
      <pubDate>Thu, 19 Sep 2024 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2024/09/19/current-2024-recap/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;This post originally appeared on the &lt;a href=&#34;https://www.decodable.co/blog/current-2024-recap&#34;&gt;Decodable blog&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Launched in 2022, the Current conference hosted by Confluent has established itself as one of the leading conferences in the data streaming space.&#xA;Stemming from Kafka Summit originally, it’s broadened to reflect Confluent’s product portfolio including most notably Apache Flink.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Current 2024 - 5k Fun Run (or Walk)</title>
      <link>https://rmoff.net/2024/09/02/current-2024-5k-fun-run-or-walk/</link>
      <pubDate>Mon, 02 Sep 2024 15:11:42 +0000</pubDate>
      <guid>https://rmoff.net/2024/09/02/current-2024-5k-fun-run-or-walk/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;At &lt;a href=&#34;https://current.confluent.io/&#34;&gt;Current 24&lt;/a&gt; a few of us will be going for an early run (or walk) on Tuesday morning. Everyone is very welcome!&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Adventures with Apache Flink and Delta Lake</title>
      <link>https://rmoff.net/2024/08/27/adventures-with-apache-flink-and-delta-lake/</link>
      <pubDate>Tue, 27 Aug 2024 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2024/08/27/adventures-with-apache-flink-and-delta-lake/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;This post originally appeared on the &lt;a href=&#34;https://www.decodable.co/blog/adventures-with-apache-flink-and-delta-lake&#34;&gt;Decodable blog&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://delta.io/&#34;&gt;Delta Lake&lt;/a&gt;  (or Delta, as it’s often shortened to) is an open-source project from the Linux Foundation that’s primarily backed by Databricks.&#xA;It’s an open table format (OTF) similar in concept to Apache Iceberg and Apache Hudi.&#xA;Having  &lt;a href=&#34;https://www.decodable.co/blog-author/robin-moffatt&#34;&gt;previously&lt;/a&gt;  dug into using Iceberg with both  &lt;a href=&#34;https://rmoff.net/2024/07/18/sending-data-to-apache-iceberg-from-apache-kafka-with-apache-flink/&#34;&gt;Apache Flink&lt;/a&gt;  and  &lt;a href=&#34;https://rmoff.net/2024/06/18/how-to-get-data-from-apache-kafka-to-apache-iceberg-on-s3-with-decodable/&#34;&gt;Decodable&lt;/a&gt; , I wanted to see what it was like to use Delta with Flink—and specifically, Flink SQL.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Declarative Resource Management for Real-time ETL with Decodable</title>
      <link>https://rmoff.net/2024/08/14/declarative-resource-management-for-real-time-etl-with-decodable/</link>
      <pubDate>Wed, 14 Aug 2024 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2024/08/14/declarative-resource-management-for-real-time-etl-with-decodable/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;This post originally appeared on the &lt;a href=&#34;https://www.decodable.co/blog/declarative-resource-management&#34;&gt;Decodable blog&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;So you’ve built your first real-time ETL pipeline with Decodable: congratulations!&#xA;Now what?&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Troubleshooting Flink SQL S3 problems</title>
      <link>https://rmoff.net/2024/08/06/troubleshooting-flink-sql-s3-problems/</link>
      <pubDate>Tue, 06 Aug 2024 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2024/08/06/troubleshooting-flink-sql-s3-problems/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;This post originally appeared on the &lt;a href=&#34;https://www.decodable.co/blog/troubleshooting-flink-sql-s3-problems&#34;&gt;Decodable blog&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;You’d think once was enough.&#xA;Having  &lt;a href=&#34;https://rmoff.net/2024/04/17/flink-sqlmisconfiguration-misunderstanding-and-mishaps/&#34;&gt;already written&lt;/a&gt;  about the trouble that I had getting Flink SQL to write to S3 (including on MinIO) this should now be a moot issue for me.&#xA;Right?&#xA;RIGHT?!&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>How to Migrate from Amazon MSF</title>
      <link>https://rmoff.net/2024/07/30/how-to-migrate-from-amazon-msf/</link>
      <pubDate>Tue, 30 Jul 2024 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2024/07/30/how-to-migrate-from-amazon-msf/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;This post originally appeared on the &lt;a href=&#34;https://www.decodable.co/blog/migrating-apache-flink-jobs-from-amazon-msf-to-decodable&#34;&gt;Decodable blog&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Amazon Managed Service for Apache Flink (MSF) is one of several providers of hosted Flink.&#xA;As my colleague Gunnar Morling described in  &lt;a href=&#34;https://www.decodable.co/blog/your-first-apache-flink-job&#34;&gt;his recent article&lt;/a&gt; , it can be used to run a Flink job that you’ve written in Java or Python (PyFlink).&#xA;But did you know that this isn’t the only way—or perhaps even the best way—to have your Flink jobs run for you?&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Sending Data to Apache Iceberg from Apache Kafka with Apache Flink</title>
      <link>https://rmoff.net/2024/07/18/sending-data-to-apache-iceberg-from-apache-kafka-with-apache-flink/</link>
      <pubDate>Thu, 18 Jul 2024 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2024/07/18/sending-data-to-apache-iceberg-from-apache-kafka-with-apache-flink/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;This post originally appeared on the &lt;a href=&#34;https://www.decodable.co/blog/kafka-to-iceberg-with-flink&#34;&gt;Decodable blog&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;em&gt;Sometimes it’s not possible to have too much of a good thing, and whilst this blog may look at first glance rather similar to the one that&lt;/em&gt; &lt;a href=&#34;https://rmoff.net/2024/06/18/how-to-get-data-from-apache-kafka-to-apache-iceberg-on-s3-with-decodable/&#34;&gt;I published just recently&lt;/a&gt; &lt;em&gt;, today we’re looking at a 100% pure Apache solution.&lt;/em&gt;&#xA;&lt;em&gt;Because who knows, maybe you prefer rolling your own tech stacks instead of letting Decodable do it for you 😉.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Decodable vs. Amazon MSF: Getting Started with Flink SQL</title>
      <link>https://rmoff.net/2024/07/02/decodable-vs.-amazon-msf-getting-started-with-flink-sql/</link>
      <pubDate>Tue, 02 Jul 2024 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2024/07/02/decodable-vs.-amazon-msf-getting-started-with-flink-sql/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;This post originally appeared on the &lt;a href=&#34;https://www.decodable.co/blog/decodable-vs-msf-getting-started-with-flink-sql&#34;&gt;Decodable blog&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;One of the things that I love about SQL is the power that it gives you to work with data in a declarative manner.&#xA;I want this thing…go do it.&#xA;How should it do it?&#xA;Well that’s the problem for the particular engine, not me.&#xA;As a language with a pedigree of multiple decades and no sign of waning (despite a wobbly patch for some whilst NoSQL figured out they actually wanted to be NewSQL 😉), it’s the &lt;em&gt;lingua franca&lt;/em&gt; of data systems.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>How to get data from Apache Kafka to Apache Iceberg on S3 with Decodable</title>
      <link>https://rmoff.net/2024/06/18/how-to-get-data-from-apache-kafka-to-apache-iceberg-on-s3-with-decodable/</link>
      <pubDate>Tue, 18 Jun 2024 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2024/06/18/how-to-get-data-from-apache-kafka-to-apache-iceberg-on-s3-with-decodable/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;This post originally appeared on the &lt;a href=&#34;https://www.decodable.co/blog/kafka-to-iceberg-with-decodable&#34;&gt;Decodable blog&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://iceberg.apache.org/&#34;&gt;Apache Iceberg&lt;/a&gt;  is an open table format.&#xA;It combines the benefits of data lakes (open standards, cheap object storage) with the good things that data warehouses have, like first-class support for tables and SQL capabilities including updates to data in place, time-travel, and transactions.&#xA;With the recent  &lt;a href=&#34;https://www.databricks.com/company/newsroom/press-releases/databricks-agrees-acquire-tabular-company-founded-original-creators&#34;&gt;acquisition&lt;/a&gt;  by Databricks of Tabular—one of the main companies that contribute to Iceberg—it’s clear that Iceberg is winning out as one of the primary contenders in this space.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Checkpoint Chronicle - May 2024</title>
      <link>https://rmoff.net/2024/05/28/checkpoint-chronicle-may-2024/</link>
      <pubDate>Tue, 28 May 2024 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2024/05/28/checkpoint-chronicle-may-2024/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;This post originally appeared on the &lt;a href=&#34;https://www.decodable.co/blog/checkpoint-chronicle-may-2024&#34;&gt;Decodable blog&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Welcome to the &lt;em&gt;Checkpoint Chronicle&lt;/em&gt;, a monthly roundup of interesting stuff in the data and streaming space.&#xA;Your hosts and esteemed curators of said content are  &lt;a href=&#34;https://twitter.com/gunnarmorling?lang=en&#34;&gt;Gunnar Morling&lt;/a&gt;  and  &lt;a href=&#34;https://twitter.com/rmoff/&#34;&gt;Robin Moffatt&lt;/a&gt;  (your editor-in-chief for this edition).&#xA;Feel free to send our way any choice nuggets that you think we should feature in future editions.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>How I Try To Keep Up With The Data Tech World (A List of Data Blogs)</title>
      <link>https://rmoff.net/2024/05/22/how-i-try-to-keep-up-with-the-data-tech-world-a-list-of-data-blogs/</link>
      <pubDate>Wed, 22 May 2024 13:19:10 +0000</pubDate>
      <guid>https://rmoff.net/2024/05/22/how-i-try-to-keep-up-with-the-data-tech-world-a-list-of-data-blogs/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I do my best to try and keep, if not abreast of, then at least aware of what’s going on in the world of data. That includes RDBMS, Event streaming, stream processing, open source data projects, data engineering, object storage, and more. If you’re interested in the same, then you might find this blog useful, because I’m sharing my sources :)&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>ngrok DNS headaches</title>
      <link>https://rmoff.net/2024/05/03/ngrok-dns-headaches/</link>
      <pubDate>Fri, 03 May 2024 10:56:30 +0000</pubDate>
      <guid>https://rmoff.net/2024/05/03/ngrok-dns-headaches/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Let’s not bury the lede: it was DNS. However, unlike the meme (&lt;em&gt;&amp;#34;It’s not DNS, it’s never DNS. It was DNS&amp;#34;&lt;/em&gt;), I didn’t even have an inkling that DNS might be the problem.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I’m writing a new blog about streaming Apache Kafka data to Apache Iceberg and wanted to provision a local Kafka cluster to pull data from remotely. I got this working nicely just last year using &lt;a href=&#34;https://rmoff.net/2023/11/01/using-apache-kafka-with-ngrok/&#34;&gt;ngrok to expose the broker to the interwebz&lt;/a&gt;, so figured I’d use this again. Simple, right?&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Nope.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>How to stop AWS CLI clearing the screen</title>
      <link>https://rmoff.net/2024/04/26/how-to-stop-aws-cli-clearing-the-screen/</link>
      <pubDate>Fri, 26 Apr 2024 12:49:50 +0000</pubDate>
      <guid>https://rmoff.net/2024/04/26/how-to-stop-aws-cli-clearing-the-screen/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;After a break from using AWS I had reason to reacquaint myself with it again today, and did so via the CLI. The &lt;a href=&#34;https://aws.amazon.com/cli/&#34;&gt;AWS CLI&lt;/a&gt; is pretty intuitive and has a good helptext system, but one thing that kept frustrating me was that after closing the help text, the screen cleared—so I couldn’t copy the syntax out to use in my command!&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;The same thing happened when I ran a command that returned output: the screen cleared.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Here’s how to fix either, or both, of these.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Flink SQL—Misconfiguration, Misunderstanding, and Mishaps</title>
      <link>https://rmoff.net/2024/04/17/flink-sqlmisconfiguration-misunderstanding-and-mishaps/</link>
      <pubDate>Wed, 17 Apr 2024 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2024/04/17/flink-sqlmisconfiguration-misunderstanding-and-mishaps/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;This post originally appeared on the &lt;a href=&#34;https://www.decodable.co/blog/flink-sql-misconfiguration-misunderstanding-and-mishaps&#34;&gt;Decodable blog&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I never meant to write this blog.&#xA;I had a whole blog series about Flink SQL lined up…and then I started to write it and realised rapidly that one’s initial exposure to Flink and Flink SQL can be somewhat, shall we say, &lt;em&gt;interesting&lt;/em&gt;.&#xA;Interesting, as in the curse, &amp;#34;may you live in interesting times&amp;#34;.&#xA;Because as wonderful and as powerful as Flink is, it is not a simple beast to run for yourself, even as a humble developer just trying to try out some SQL.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Checkpoint Chronicle - March 2024</title>
      <link>https://rmoff.net/2024/03/22/checkpoint-chronicle-march-2024/</link>
      <pubDate>Fri, 22 Mar 2024 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2024/03/22/checkpoint-chronicle-march-2024/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;This post originally appeared on the &lt;a href=&#34;https://www.decodable.co/blog/checkpoint-chronicle-march-2024&#34;&gt;Decodable blog&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Welcome to the &lt;em&gt;Checkpoint Chronicle&lt;/em&gt;, a monthly roundup of interesting stuff in the data and streaming space.&#xA;Your hosts and esteemed curators of said content are  &lt;a href=&#34;https://twitter.com/gunnarmorling?lang=en&#34;&gt;Gunnar Morling&lt;/a&gt;  and  &lt;a href=&#34;https://twitter.com/rmoff/&#34;&gt;Robin Moffatt&lt;/a&gt;  (your editor-in-chief for this edition).&#xA;Feel free to send our way any choice nuggets that you think we should feature in future editions.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>🏃🚶 The unofficial Kafka Summit London 2024  Run/Walk 🏃🚶</title>
      <link>https://rmoff.net/2024/03/15/the-unofficial-kafka-summit-london-2024-run/walk/</link>
      <pubDate>Fri, 15 Mar 2024 15:00:05 +0000</pubDate>
      <guid>https://rmoff.net/2024/03/15/the-unofficial-kafka-summit-london-2024-run/walk/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;At this year’s Kafka Summit I’m planning to continue the tradition of going for a run (or walk) with anyone who’d like to join in. This started back at Kafka Summit San Francisco in 2019 &lt;a href=&#34;https://twitter.com/rmoff/status/1179047181891883008&#34;&gt;over the Golden Gate Bridge&lt;/a&gt; and has continued since then. Whilst London’s Docklands might not offer &lt;em&gt;quite&lt;/em&gt; the same experience, it’ll be fun nonetheless.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Apache Flink talks at Kafka Summit London 2024</title>
      <link>https://rmoff.net/2024/03/15/apache-flink-talks-at-kafka-summit-london-2024/</link>
      <pubDate>Fri, 15 Mar 2024 13:54:39 +0000</pubDate>
      <guid>https://rmoff.net/2024/03/15/apache-flink-talks-at-kafka-summit-london-2024/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;This year Kafka Summit London includes a dedicated track for talks about Apache Flink. This reflects the continued rise of interest and use of Apache Flink in the streaming community, as well as the focus that Confluent (the hosts of Kafka Summit) has on it.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I’m looking forward to being back at Kafka Summit. I will be speaking on Tuesday afternoon, room hosting on Wednesday morning, and hanging out at &lt;a href=&#34;https://www.decodable.co/blog/meet-decodable-at-kafka-summit-london-2024&#34;&gt;the Decodable booth&lt;/a&gt; in between too.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Here’s a list of all the Flink talks, including the talk, time, and speaker. You can find more details, and the full Kafka Summit agenda, &lt;a href=&#34;https://events.bizzabo.com/559905/agenda&#34;&gt;here&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Exploring the Flink SQL Gateway REST API</title>
      <link>https://rmoff.net/2024/03/12/exploring-the-flink-sql-gateway-rest-api/</link>
      <pubDate>Tue, 12 Mar 2024 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2024/03/12/exploring-the-flink-sql-gateway-rest-api/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;This post originally appeared on the &lt;a href=&#34;https://www.decodable.co/blog/exploring-the-flink-sql-gateway-rest-api&#34;&gt;Decodable blog&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;The &lt;a href=&#34;https://nightlies.apache.org/flink/flink-docs-master/docs/dev/table/sql-gateway/overview/&#34;&gt;SQL Gateway in Apache Flink&lt;/a&gt; provides a way to run SQL in Flink from places other than the SQL Client.&#xA;This includes using a &lt;a href=&#34;https://rmoff.net/2023/11/16/learning-apache-flink-s01e06-the-flink-jdbc-driver/&#34;&gt;JDBC Driver&lt;/a&gt; (which opens up a multitude of clients), a Hive client via the &lt;a href=&#34;https://nightlies.apache.org/flink/flink-docs-master/docs/dev/table/sql-gateway/hiveserver2/&#34;&gt;HiveServer2 endpoint&lt;/a&gt;, and directly against the &lt;a href=&#34;https://nightlies.apache.org/flink/flink-docs-master/docs/dev/table/sql-gateway/rest/&#34;&gt;REST Endpoint&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Flink SQL and the Joy of JARs</title>
      <link>https://rmoff.net/2024/02/27/flink-sql-and-the-joy-of-jars/</link>
      <pubDate>Tue, 27 Feb 2024 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2024/02/27/flink-sql-and-the-joy-of-jars/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;This post originally appeared on the &lt;a href=&#34;https://www.decodable.co/blog/flink-sql-and-the-joy-of-jars&#34;&gt;Decodable blog&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I will wager you half of my lottery winnings from 2023[1] that you’re going to encounter this lovely little error at some point on your Flink SQL journey:&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Checkpoint Chronicle - February 2024</title>
      <link>https://rmoff.net/2024/02/22/checkpoint-chronicle-february-2024/</link>
      <pubDate>Thu, 22 Feb 2024 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2024/02/22/checkpoint-chronicle-february-2024/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;This post originally appeared on the &lt;a href=&#34;https://www.decodable.co/blog/checkpoint-chronicle-february-2024&#34;&gt;Decodable blog&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Welcome to the &lt;em&gt;Checkpoint Chronicle&lt;/em&gt;, a monthly roundup of interesting stuff in the data and streaming space.&#xA;Your hosts and esteemed curators of said content are  &lt;a href=&#34;https://twitter.com/gunnarmorling?lang=en&#34;&gt;Gunnar Morling&lt;/a&gt;  and  &lt;a href=&#34;https://twitter.com/rmoff/&#34;&gt;Robin Moffatt&lt;/a&gt;  (your editor-in-chief for this edition).&#xA;Feel free to send our way any choice nuggets that you think we should feature in future editions.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Catalogs in Flink SQL—Hands On</title>
      <link>https://rmoff.net/2024/02/19/catalogs-in-flink-sqlhands-on/</link>
      <pubDate>Mon, 19 Feb 2024 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2024/02/19/catalogs-in-flink-sqlhands-on/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;This post originally appeared on the &lt;a href=&#34;https://www.decodable.co/blog/catalogs-in-flink-sql-hands-on&#34;&gt;Decodable blog&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;In the  &lt;a href=&#34;https://rmoff.net/2024/02/16/catalogs-in-flink-sqla-primer/&#34;&gt;previous blog post&lt;/a&gt;  I looked at the role of catalogs in Flink SQL, the different types, and some of the quirks around their configuration and use.&#xA;If you are new to Flink SQL and catalogs, I would recommend reading that post just to make sure you’re not making some of the same assumptions that I mistakenly did when looking at this for the first time.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Catalogs in Flink SQL—A Primer</title>
      <link>https://rmoff.net/2024/02/16/catalogs-in-flink-sqla-primer/</link>
      <pubDate>Fri, 16 Feb 2024 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2024/02/16/catalogs-in-flink-sqla-primer/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;This post originally appeared on the &lt;a href=&#34;https://www.decodable.co/blog/catalogs-in-flink-sql-a-primer&#34;&gt;Decodable blog&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;When you’re using Flink SQL you’ll run queries that interact with objects.&#xA;An &lt;code&gt;INSERT&lt;/code&gt; against a &lt;code&gt;TABLE&lt;/code&gt;, a &lt;code&gt;SELECT&lt;/code&gt; against a &lt;code&gt;VIEW&lt;/code&gt;, for example.&#xA;These objects are defined using DDL—but where do these definitions live?&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Antora Deployment to Cloudflare Across Private Repositories with GitHub Actions</title>
      <link>https://rmoff.net/2024/01/17/antora-deployment-to-cloudflare-across-private-repositories-with-github-actions/</link>
      <pubDate>Wed, 17 Jan 2024 12:09:23 +0000</pubDate>
      <guid>https://rmoff.net/2024/01/17/antora-deployment-to-cloudflare-across-private-repositories-with-github-actions/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;At &lt;a href=&#34;https://decodable.co&#34;&gt;Decodable&lt;/a&gt; we migrated our docs platform onto &lt;a href=&#34;https://antora.org/&#34;&gt;Antora&lt;/a&gt;. I wrote &lt;a href=&#34;https://rmoff.net/2023/12/19/deploying-antora-with-github-actions-and-a-private-github-repo/&#34;&gt;previously&lt;/a&gt; about my escapades in getting cross-repository authentication working using &lt;a href=&#34;https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens#types-of-personal-access-tokens&#34;&gt;Private Access Tokens&lt;/a&gt; (PAT). These are fine for just a single user, but they’re tied to that user, which isn’t a good practice for deployment in this case.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;In this article I’ll show how to use GitHub Apps and Installation Access Tokens (IAT) instead, and go into some detail on how we’ve deployed Antora. Our GitHub repositories are private which makes it extra-gnarly.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Hosting on GitHub Pages? Watch out for Subdomain Hijacking</title>
      <link>https://rmoff.net/2024/01/16/hosting-on-github-pages-watch-out-for-subdomain-hijacking/</link>
      <pubDate>Tue, 16 Jan 2024 11:50:36 +0000</pubDate>
      <guid>https://rmoff.net/2024/01/16/hosting-on-github-pages-watch-out-for-subdomain-hijacking/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;A friend messaged me late last night with the scary news that Google had emailed him about a ton of spammy subdomains on his own domain.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;imageblock&#34;&gt;&#xA;&lt;div class=&#34;content&#34;&gt;&#xA;&lt;img src=&#34;https://rmoff.net/images/2024/01/g1.webp&#34; alt=&#34;A list of spam domains as reported by Google&#34;/&gt;&#xA;&lt;/div&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;em&gt;Any idea how this could have happened, he asked?&lt;/em&gt;&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>1️⃣🐝🏎️🦆 (1BRC in SQL with DuckDB)</title>
      <link>https://rmoff.net/2024/01/03/1%EF%B8%8F%E2%83%A3%EF%B8%8F-1brc-in-sql-with-duckdb/</link>
      <pubDate>Wed, 03 Jan 2024 12:12:32 +0000</pubDate>
      <guid>https://rmoff.net/2024/01/03/1%EF%B8%8F%E2%83%A3%EF%B8%8F-1brc-in-sql-with-duckdb/</guid>
      <description>&lt;p&gt;&lt;strong&gt;Why should the Java folk have all the fun?!&lt;/strong&gt;&lt;/p&gt;&#xA;&lt;p&gt;My friend and colleague &lt;a href=&#34;https://twitter.com/gunnarmorling/&#34;&gt;Gunnar Morling&lt;/a&gt; &lt;a href=&#34;https://www.morling.dev/blog/one-billion-row-challenge/&#34;&gt;launched a fun challenge&lt;/a&gt; this week: how fast can you aggregate and summarise a billion rows of data? Cunningly named The One Billion Row Challenge (1BRC for short), it&amp;rsquo;s aimed at Java coders to look at new features in the language and optimisation techniques.&lt;/p&gt;&#xA;&lt;p&gt;Not being a Java coder myself, and seeing how the challenge has already unofficially spread to other communities &lt;a href=&#34;https://www.reddit.com/r/rust/comments/18ws370/optimizing_a_one_billion_row_challenge_in_rust/&#34;&gt;including Rust and Python&lt;/a&gt; I thought I&amp;rsquo;d join in the fun using what I know best: SQL.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Deploying Antora with GitHub Actions and a private GitHub repo</title>
      <link>https://rmoff.net/2023/12/19/deploying-antora-with-github-actions-and-a-private-github-repo/</link>
      <pubDate>Tue, 19 Dec 2023 13:35:19 +0000</pubDate>
      <guid>https://rmoff.net/2023/12/19/deploying-antora-with-github-actions-and-a-private-github-repo/</guid>
      <description>&lt;p&gt;&lt;a href=&#34;https://antora.org/&#34;&gt;Antora&lt;/a&gt; is a modern documentation site generator with many nice features including sourcing documentation content from one or more separate git repositories. This means that your docs can be kept under source control (yay 🎉) and in sync with the code of the product that they are documenting (double yay 🎉🎉).&lt;/p&gt;&#xA;&lt;p&gt;As you would expect for a documentation tool, the &lt;a href=&#34;https://docs.antora.org/antora/latest/&#34;&gt;Antora documentation&lt;/a&gt; is thorough but there was one sharp edge involving GitHub that caught me out which I&amp;rsquo;ll detail here.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Checkpoint Chronicle - December 2023</title>
      <link>https://rmoff.net/2023/12/13/checkpoint-chronicle-december-2023/</link>
      <pubDate>Wed, 13 Dec 2023 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2023/12/13/checkpoint-chronicle-december-2023/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;This post originally appeared on the &lt;a href=&#34;https://www.decodable.co/blog/checkpoint-chronicle-december-2023&#34;&gt;Decodable blog&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Welcome to the &lt;em&gt;Checkpoint Chronicle&lt;/em&gt;, a monthly roundup of interesting stuff in the data and streaming space.&#xA;Your hosts and esteemed curators of said content are  &lt;a href=&#34;https://twitter.com/gunnarmorling?lang=en&#34;&gt;Gunnar Morling&lt;/a&gt;  and  &lt;a href=&#34;https://twitter.com/rmoff/&#34;&gt;Robin Moffatt&lt;/a&gt; —feel free to send our way any choice nuggets that you think we should feature in future editions.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Productivity tools: AI Image Generators</title>
      <link>https://rmoff.net/2023/12/07/productivity-tools-ai-image-generators/</link>
      <pubDate>Thu, 07 Dec 2023 19:59:41 +0000</pubDate>
      <guid>https://rmoff.net/2023/12/07/productivity-tools-ai-image-generators/</guid>
      <description>&lt;p&gt;AI, what a load of hyped-up bollocks, right? Yet here I am, legit writing a blog about it and not for the clickbait but…&lt;em&gt;gasp&lt;/em&gt;…because it&amp;rsquo;s actually useful.&lt;/p&gt;&#xA;&lt;p&gt;Used correctly, it&amp;rsquo;s just like any other tool on your desktop. It helps you get stuff done quicker, better—or both.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Hugo not detecting changed pages on Mac</title>
      <link>https://rmoff.net/2023/11/16/hugo-not-detecting-changed-pages-on-mac/</link>
      <pubDate>Thu, 16 Nov 2023 15:27:22 +0000</pubDate>
      <guid>https://rmoff.net/2023/11/16/hugo-not-detecting-changed-pages-on-mac/</guid>
      <description>&lt;p&gt;I&amp;rsquo;ve used Hugo for my blog for several years now, and it&amp;rsquo;s great. One of the things I love about it is the fast build time coupled with its live-reload feature. Using this I can edit my source (Markdown or Asciidoc) in one window, hit save, and see the preview update in my browser window next to it pretty much instantaneously. For copy-editing, experimenting with images, etc. this is really helpful.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Learning Apache Flink S01E06: The Flink JDBC Driver</title>
      <link>https://rmoff.net/2023/11/16/learning-apache-flink-s01e06-the-flink-jdbc-driver/</link>
      <pubDate>Thu, 16 Nov 2023 15:20:20 +0000</pubDate>
      <guid>https://rmoff.net/2023/11/16/learning-apache-flink-s01e06-the-flink-jdbc-driver/</guid>
      <description>&lt;p&gt;As a newcomer to Apache Flink one of the first things I did was join the &lt;a href=&#34;https://flink.apache.org/what-is-flink/community/#slack&#34;&gt;Slack community&lt;/a&gt; (which is vendor-neutral and controlled by the Flink PMC). At the moment I&amp;rsquo;m pretty much in full-time lurker mode, soaking up the kind of questions that people have and how they&amp;rsquo;re using Flink.&lt;/p&gt;&#xA;&lt;p&gt;One &lt;a href=&#34;https://apache-flink.slack.com/archives/C03G7LJTS2G/p1699672468626739&#34;&gt;question&lt;/a&gt; that caught my eye was from Marco Villalobos, in which he asked about the Flink JDBC driver and a &lt;code&gt;SQLDataException&lt;/code&gt;  he was getting with a particular datatype. Now, unfortunately, I have no idea about the answer to this question—but the idea of a JDBC driver through which Flink SQL could be run sounded like a fascinating path to follow after &lt;a href=&#34;https://rmoff.net/2023/10/10/learning-apache-flink-s01e04-a-partial-exploration-of-the-flink-sql-client/&#34;&gt;previously looking at the SQL Client&lt;/a&gt;.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Checkpoint Chronicle - November 2023</title>
      <link>https://rmoff.net/2023/11/14/checkpoint-chronicle-november-2023/</link>
      <pubDate>Tue, 14 Nov 2023 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2023/11/14/checkpoint-chronicle-november-2023/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;This post originally appeared on the &lt;a href=&#34;https://www.decodable.co/blog/checkpoint-chronicle-november-2023&#34;&gt;Decodable blog&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Welcome to the &lt;em&gt;Checkpoint Chronicle&lt;/em&gt;, a monthly roundup of interesting stuff in the data and streaming space.&#xA;Your hosts and esteemed curators of said content are  &lt;a href=&#34;https://twitter.com/gunnarmorling?lang=en&#34;&gt;Gunnar Morling&lt;/a&gt;  and  &lt;a href=&#34;https://twitter.com/rmoff/&#34;&gt;Robin Moffatt&lt;/a&gt; —feel free to send our way any choice nuggets that you think we should feature in future editions.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Using Apache Kafka with ngrok</title>
      <link>https://rmoff.net/2023/11/01/using-apache-kafka-with-ngrok/</link>
      <pubDate>Wed, 01 Nov 2023 10:07:58 +0000</pubDate>
      <guid>https://rmoff.net/2023/11/01/using-apache-kafka-with-ngrok/</guid>
      <description>&lt;p&gt;Sometimes you might want to access Apache Kafka that&amp;rsquo;s running on your local machine from another device not on the same network. I&amp;rsquo;m not sure I can think of a production use-case, but there are a dozen examples for sandbox, demo, and playground environments.&lt;/p&gt;&#xA;&lt;p&gt;In this post we&amp;rsquo;ll see how you can use &lt;a href=&#34;https://ngrok.com/&#34;&gt;ngrok&lt;/a&gt; to, in their words, &lt;code&gt;Put localhost on the internet&lt;/code&gt;. And specifically, your local Kafka broker on the internet.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Learning Apache Flink S01E05: Installing PyFlink (with some bumps along the way…)</title>
      <link>https://rmoff.net/2023/10/25/learning-apache-flink-s01e05-installing-pyflink-with-some-bumps-along-the-way/</link>
      <pubDate>Wed, 25 Oct 2023 15:27:22 +0000</pubDate>
      <guid>https://rmoff.net/2023/10/25/learning-apache-flink-s01e05-installing-pyflink-with-some-bumps-along-the-way/</guid>
      <description>&lt;p&gt;When I started &lt;a href=&#34;https://rmoff.net/categories/laf/&#34;&gt;my journey learning Apache Flink&lt;/a&gt; one of the things that several people expressed an interest in hearing more about was PyFlink.  This appeals to me too, because whilst Java is just something I don&amp;rsquo;t know and feels beyond me to try and learn, Python is something that I know enough of to at least hack my way around it. I&amp;rsquo;ve previously &lt;a href=&#34;https://rmoff.net/2022/09/16/data-engineering-in-2022-exploring-lakefs-with-jupyter-and-pyspark/&#34;&gt;had fun with PySpark&lt;/a&gt;, and whilst &lt;a href=&#34;https://rmoff.net/categories/flink-sql/&#34;&gt;Flink SQL&lt;/a&gt; will probably be one of my main focusses, I also want to get a feel for PyFlink.&lt;/p&gt;&#xA;&lt;p&gt;The first step to using PyFlink is installing it - which should be simple, right?&lt;/p&gt;</description>
    </item>
    <item>
      <title>Learning Apache Flink S01E04: A [Partial] Exploration of the Flink SQL Client</title>
      <link>https://rmoff.net/2023/10/10/learning-apache-flink-s01e04-a-partial-exploration-of-the-flink-sql-client/</link>
      <pubDate>Tue, 10 Oct 2023 16:27:22 +0000</pubDate>
      <guid>https://rmoff.net/2023/10/10/learning-apache-flink-s01e04-a-partial-exploration-of-the-flink-sql-client/</guid>
      <description>&lt;p&gt;So far I&amp;rsquo;ve plotted out a bit of a &lt;a href=&#34;https://rmoff.net/2023/09/29/learning-apache-flink-s01e01-where-do-i-start/&#34;&gt;map for my exploration&lt;/a&gt; of Apache Flink, looked at &lt;a href=&#34;https://rmoff.net/2023/10/02/learning-apache-flink-s01e02-what-is-flink/&#34;&gt;what  Flink &lt;em&gt;is&lt;/em&gt;&lt;/a&gt;, and &lt;a href=&#34;https://rmoff.net/2023/10/05/learning-apache-flink-s01e03-running-my-first-flink-cluster-and-application/&#34;&gt;run my first Flink application&lt;/a&gt;. Being an absolutely abysmal coder—but knowing a thing or two about SQL—I figure that Flink SQL is where my focus is going to lie (&lt;em&gt;I&amp;rsquo;m also intrigued by PyFlink, but that&amp;rsquo;s for another day…&lt;/em&gt;).&lt;/p&gt;</description>
    </item>
    <item>
      <title>Learning Apache Flink S01E03: Running my First Flink Cluster and Application</title>
      <link>https://rmoff.net/2023/10/05/learning-apache-flink-s01e03-running-my-first-flink-cluster-and-application/</link>
      <pubDate>Thu, 05 Oct 2023 14:29:02 +0000</pubDate>
      <guid>https://rmoff.net/2023/10/05/learning-apache-flink-s01e03-running-my-first-flink-cluster-and-application/</guid>
      <description>&lt;p&gt;🎉 I just ran my first Apache Flink cluster and application on it 🎉&lt;/p&gt;</description>
    </item>
    <item>
      <title>cd: string not in pwd</title>
      <link>https://rmoff.net/2023/10/04/cd-string-not-in-pwd/</link>
      <pubDate>Wed, 04 Oct 2023 15:36:35 +0000</pubDate>
      <guid>https://rmoff.net/2023/10/04/cd-string-not-in-pwd/</guid>
      <description>&lt;p&gt;A brief diversion from my &lt;a href=&#34;https://rmoff.net/categories/laf/&#34;&gt;journey learning Apache Flink&lt;/a&gt; to document an interesting &lt;code&gt;zsh&lt;/code&gt; oddity that briefly tripped me up:&lt;/p&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-shell&#34; data-lang=&#34;shell&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;cd: string not in pwd: flink-1.17.1&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Learning Apache Flink S01E02: What *is* Flink?</title>
      <link>https://rmoff.net/2023/10/02/learning-apache-flink-s01e02-what-is-flink/</link>
      <pubDate>Mon, 02 Oct 2023 15:12:14 +0000</pubDate>
      <guid>https://rmoff.net/2023/10/02/learning-apache-flink-s01e02-what-is-flink/</guid>
      <description>&lt;p&gt;My &lt;a href=&#34;https://rmoff.net/2023/09/29/learning-apache-flink-s01e01-where-do-i-start/&#34;&gt;journey&lt;/a&gt; with &lt;a href=&#34;https://flink.apache.org&#34;&gt;Apache Flink&lt;/a&gt; begins with an overview of &lt;em&gt;what Flink actually is&lt;/em&gt;.&lt;/p&gt;&#xA;&lt;p&gt;What better place to start than the &lt;a href=&#34;https://nightlies.apache.org/flink/flink-docs-release-1.17/#apache-flink-documentation&#34;&gt;Apache Flink website itself&lt;/a&gt;:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;&lt;strong&gt;Apache Flink&lt;/strong&gt; is a framework and distributed processing engine for stateful computations over &lt;em&gt;unbounded&lt;/em&gt; and &lt;em&gt;bounded&lt;/em&gt; data streams. Flink has been designed to run in &lt;em&gt;all common cluster environments&lt;/em&gt;, perform computations at &lt;em&gt;in-memory&lt;/em&gt; speed and at &lt;em&gt;any scale&lt;/em&gt;.&lt;/p&gt;&#xA;&lt;/blockquote&gt;</description>
    </item>
    <item>
      <title>Learning Apache Flink S01E01: Where Do I Start?</title>
      <link>https://rmoff.net/2023/09/29/learning-apache-flink-s01e01-where-do-i-start/</link>
      <pubDate>Fri, 29 Sep 2023 12:57:04 +0000</pubDate>
      <guid>https://rmoff.net/2023/09/29/learning-apache-flink-s01e01-where-do-i-start/</guid>
      <description>&lt;p&gt;Like a fortunate child on Christmas Day, I&amp;rsquo;ve got a brand new toy! A brand new—to me—open-source technology to unwrap, learn, and perhaps even aspire to master elements of within.&lt;/p&gt;&#xA;&lt;p&gt;I &lt;a href=&#34;https://rmoff.net/2023/09/21/an-itch-that-just-has-to-be-scratched-or-why-am-i-joining-decodable&#34;&gt;joined Decodable&lt;/a&gt; two weeks ago, and since &lt;a href=&#34;https://decodable.co/&#34;&gt;Decodable&lt;/a&gt; is built on top of &lt;a href=&#34;https://flink.apache.org&#34;&gt;Apache Flink&lt;/a&gt; it seems like a great time to learn it. After six years learning Apache Kafka and hearing about this &amp;ldquo;Flink&amp;rdquo; thing but—for better or worse—never investigating it, I now have the perfect opportunity to do so.&lt;/p&gt;</description>
    </item>
    <item>
      <title>An Itch That Just Has to Be Scratched… (Or, Why Am I Joining Decodable?)</title>
      <link>https://rmoff.net/2023/09/21/an-itch-that-just-has-to-be-scratched-or-why-am-i-joining-decodable/</link>
      <pubDate>Thu, 21 Sep 2023 14:25:39 +0000</pubDate>
      <guid>https://rmoff.net/2023/09/21/an-itch-that-just-has-to-be-scratched-or-why-am-i-joining-decodable/</guid>
      <description>&lt;p&gt;This week I joined &lt;a href=&#34;https://decodable.co&#34;&gt;Decodable&lt;/a&gt;. I&amp;rsquo;m grateful to my former colleagues at Treeverse for allowing me to &lt;a href=&#34;https://rmoff.net/2022/12/09/looking-forwards-and-looking-backwards/&#34;&gt;join them&lt;/a&gt; on the journey with &lt;a href=&#34;https://lakefs.io&#34;&gt;lakeFS&lt;/a&gt; - but something about the streaming world was too strong to resist 😁.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Blog Writing for Developers</title>
      <link>https://rmoff.net/2023/07/19/blog-writing-for-developers/</link>
      <pubDate>Wed, 19 Jul 2023 20:59:09 +0000</pubDate>
      <guid>https://rmoff.net/2023/07/19/blog-writing-for-developers/</guid>
      <description>&lt;p&gt;Writing is one of the most powerful forms of communication, and it’s useful in a multitude of roles and contexts. As a &lt;a href=&#34;https://rmoff.net&#34;&gt;blog-writing&lt;/a&gt;, &lt;a href=&#34;https://github.com/treeverse/lakeFS/pulls?q=is%3Apr+label%3Adocs+author%3Armoff+&#34;&gt;documentation-authoring&lt;/a&gt;, &lt;a href=&#34;https://twitter.com/rmoff/status/1587382202781913089&#34;&gt;twitter-shitposting&lt;/a&gt; DevEx engineer I spend a lot of my time writing. Recently, someone paid me a very nice compliment about a blog I’d written and asked how they could learn to write like me and what resources I’d recommend.&lt;/p&gt;&#xA;&lt;p&gt;Never one to miss a chance to write and share something, here’s my response to this :)&lt;/p&gt;</description>
    </item>
    <item>
      <title>What Does This DevEx Engineer Do?</title>
      <link>https://rmoff.net/2023/05/23/what-does-this-devex-engineer-do/</link>
      <pubDate>Tue, 23 May 2023 23:56:16 +0000</pubDate>
      <guid>https://rmoff.net/2023/05/23/what-does-this-devex-engineer-do/</guid>
      <description>&lt;p&gt;&lt;em&gt;This was originally titled more broadly &amp;ldquo;What Does &lt;em&gt;A&lt;/em&gt; DevEx Engineer Do&amp;rdquo;, but that made it into a far too tedious and long-winding etymological exploration of the discipline. Instead, I&amp;rsquo;m going to tell you what this particular instantiation of the entity does 😄&lt;/em&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>Authoring Wordpress blogs in Markdown (with Google Docs for review)</title>
      <link>https://rmoff.net/2023/05/03/authoring-wordpress-blogs-in-markdown-with-google-docs-for-review/</link>
      <pubDate>Wed, 03 May 2023 08:59:17 +0000</pubDate>
      <guid>https://rmoff.net/2023/05/03/authoring-wordpress-blogs-in-markdown-with-google-docs-for-review/</guid>
      <description>&lt;p&gt;Wordpress still, to an extent, rules the blogging world. Its longevity is testament to…something about it ;) However, it&amp;rsquo;s not my favourite platform in which to write a blog by a long way. It doesn&amp;rsquo;t support Markdown to the extent that I want. Yes, I&amp;rsquo;ve tried the plugins; no, they didn&amp;rsquo;t do what I needed.&lt;/p&gt;&#xA;&lt;p&gt;I like to write all my content in a structured format - ideally &lt;a href=&#34;https://asciidoc.org/&#34;&gt;Asciidoc&lt;/a&gt;, but &lt;a href=&#34;https://rmoff.net/2017/09/12/what-is-markdown-and-why-is-it-awesome/&#34;&gt;I&amp;rsquo;ll settle for Markdown too&lt;/a&gt;. Here&amp;rsquo;s how I stayed [almost] sane whilst composing a blog in Markdown, reviewing it in Google Docs, and then publishing it in Wordpress in a non-lossy way.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Building Better Docs - Automating Jekyll Builds and Link Checking for PRs</title>
      <link>https://rmoff.net/2023/04/20/building-better-docs-automating-jekyll-builds-and-link-checking-for-prs/</link>
      <pubDate>Thu, 20 Apr 2023 08:54:11 +0000</pubDate>
      <guid>https://rmoff.net/2023/04/20/building-better-docs-automating-jekyll-builds-and-link-checking-for-prs/</guid>
      <description>&lt;p&gt;One of the most important ways that a project can help its developers is providing them good documentation. Actually, scratch that. &lt;em&gt;Great&lt;/em&gt; documentation.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Using Delta from pySpark - `java.lang.ClassNotFoundException: delta.DefaultSource`</title>
      <link>https://rmoff.net/2023/04/05/using-delta-from-pyspark-java.lang.classnotfoundexception-delta.defaultsource/</link>
      <pubDate>Wed, 05 Apr 2023 15:51:41 +0000</pubDate>
      <guid>https://rmoff.net/2023/04/05/using-delta-from-pyspark-java.lang.classnotfoundexception-delta.defaultsource/</guid>
      <description>&lt;p&gt;No great insights in this post, just something for folk who Google this error after me and don&amp;rsquo;t want to waste three hours chasing their tails… 😄&lt;/p&gt;</description>
    </item>
    <item>
      <title>Quickly Convert CSV to Parquet with DuckDB</title>
      <link>https://rmoff.net/2023/03/14/quickly-convert-csv-to-parquet-with-duckdb/</link>
      <pubDate>Tue, 14 Mar 2023 15:12:31 +0000</pubDate>
      <guid>https://rmoff.net/2023/03/14/quickly-convert-csv-to-parquet-with-duckdb/</guid>
      <description>&lt;p&gt;Here&amp;rsquo;s a neat little trick you can use with &lt;a href=&#34;https://duckdb.org/&#34;&gt;DuckDB&lt;/a&gt; to convert a CSV file into a Parquet file:&lt;/p&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-sql&#34; data-lang=&#34;sql&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#008000;font-weight:bold&#34;&gt;COPY&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;(&lt;span style=&#34;color:#008000;font-weight:bold&#34;&gt;SELECT&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color:#666&#34;&gt;*&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#bbb&#34;&gt;&#x9;    &lt;/span&gt;&lt;span style=&#34;color:#008000;font-weight:bold&#34;&gt;FROM&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;read_csv(&lt;span style=&#34;color:#ba2121&#34;&gt;&amp;#39;~/data/source.csv&amp;#39;&lt;/span&gt;,AUTO_DETECT&lt;span style=&#34;color:#666&#34;&gt;=&lt;/span&gt;&lt;span style=&#34;color:#008000;font-weight:bold&#34;&gt;TRUE&lt;/span&gt;))&lt;span style=&#34;color:#bbb&#34;&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#bbb&#34;&gt;  &lt;/span&gt;&lt;span style=&#34;color:#008000;font-weight:bold&#34;&gt;TO&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color:#ba2121&#34;&gt;&amp;#39;~/data/target.parquet&amp;#39;&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;(FORMAT&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color:#ba2121&#34;&gt;&amp;#39;PARQUET&amp;#39;&lt;/span&gt;,&lt;span 
style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;CODEC&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color:#ba2121&#34;&gt;&amp;#39;ZSTD&amp;#39;&lt;/span&gt;);&lt;span style=&#34;color:#bbb&#34;&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Making the move from Alfred to Raycast</title>
      <link>https://rmoff.net/2023/03/03/making-the-move-from-alfred-to-raycast/</link>
      <pubDate>Fri, 03 Mar 2023 23:14:06 +0000</pubDate>
      <guid>https://rmoff.net/2023/03/03/making-the-move-from-alfred-to-raycast/</guid>
      <description>&lt;p&gt;It all started with a tweet.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Aligning mismatched Parquet schemas in DuckDB</title>
      <link>https://rmoff.net/2023/03/03/aligning-mismatched-parquet-schemas-in-duckdb/</link>
      <pubDate>Fri, 03 Mar 2023 14:36:08 +0000</pubDate>
      <guid>https://rmoff.net/2023/03/03/aligning-mismatched-parquet-schemas-in-duckdb/</guid>
      <description>&lt;p&gt;What do you do when you want to query over multiple parquet files but the schemas don&amp;rsquo;t quite line up? Let&amp;rsquo;s find out 👇🏻&lt;/p&gt;</description>
    </item>
    <item>
      <title>Looking Forwards, and Looking Backwards</title>
      <link>https://rmoff.net/2022/12/09/looking-forwards-and-looking-backwards/</link>
      <pubDate>Fri, 09 Dec 2022 09:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2022/12/09/looking-forwards-and-looking-backwards/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;As we enter December and 2022 draws to a close, so does a significant chapter in my working career—later this month I’ll be leaving Confluent and moving on to pastures new.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;It’s nearly six years since I wrote a &lt;a href=&#34;https://rmoff.net/2017/03/10/time-for-a-change/&#34;&gt;&amp;#39;moving on&amp;#39; blog entry&lt;/a&gt;, and as well as sharing what I’ll be working on next (and why), I also want to reflect on how much I’ve benefited from my time at Confluent and particularly the people with whom I worked.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Data Engineering in 2022: ELT tools</title>
      <link>https://rmoff.net/2022/11/08/data-engineering-in-2022-elt-tools/</link>
      <pubDate>Tue, 08 Nov 2022 19:46:39 +0000</pubDate>
      <guid>https://rmoff.net/2022/11/08/data-engineering-in-2022-elt-tools/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;In &lt;a href=&#34;https://rmoff.net/2022/09/14/stretching-my-legs-in-the-data-engineering-ecosystem-in-2022/&#34;&gt;my quest&lt;/a&gt; to bring myself up to date with where the data &amp;amp; analytics engineering world is at nowadays, I’m going to build on my exploration of the &lt;a href=&#34;https://rmoff.net/2022/09/14/data-engineering-in-2022-storage-and-access/&#34;&gt;storage and access&lt;/a&gt; technologies and look at the tools we use for loading and transforming data.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Data Engineering in 2022: Wrangling the feedback data from Current 22 with dbt</title>
      <link>https://rmoff.net/2022/10/24/data-engineering-in-2022-wrangling-the-feedback-data-from-current-22-with-dbt/</link>
      <pubDate>Mon, 24 Oct 2022 12:27:14 +0000</pubDate>
      <guid>https://rmoff.net/2022/10/24/data-engineering-in-2022-wrangling-the-feedback-data-from-current-22-with-dbt/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I started my dbt journey by &lt;a href=&#34;https://rmoff.net/2022/10/20/data-engineering-in-2022-exploring-dbt-with-duckdb/&#34;&gt;poking and pulling at the pre-built jaffle_shop demo running with DuckDB as its data store&lt;/a&gt;. Now I want to see if I can put it to use myself to wrangle the session feedback data that came in from &lt;a href=&#34;https://2022.currentevent.io/&#34;&gt;Current 2022&lt;/a&gt;. I’ve &lt;a href=&#34;https://rmoff.net/2022/10/14/current-22-session-analysis-with-duckdb-and-jupyter-notebook/&#34;&gt;analysed&lt;/a&gt; this already, but it struck me that a particular part of it would benefit from some tidying up - and be a good excuse to see what it’s like using dbt to do so.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Data Engineering in 2022: Exploring dbt with DuckDB</title>
      <link>https://rmoff.net/2022/10/20/data-engineering-in-2022-exploring-dbt-with-duckdb/</link>
      <pubDate>Thu, 20 Oct 2022 17:07:04 +0000</pubDate>
      <guid>https://rmoff.net/2022/10/20/data-engineering-in-2022-exploring-dbt-with-duckdb/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I’ve been wanting to try out dbt for some time now, and a recent long-haul flight seemed like the obvious opportunity to do so. Except that many of the dbt tutorials I found were based on using dbt Cloud, and airplane WiFi is generally sucky or non-existent. Then I found the &lt;a href=&#34;https://github.com/dbt-labs/jaffle_shop_duckdb&#34;&gt;DuckDB-based demo of dbt&lt;/a&gt;, which seemed to fit the bill (🦆 geddit?!) perfectly, since DuckDB runs locally. In addition, &lt;a href=&#34;https://duckdb.org/&#34;&gt;DuckDB&lt;/a&gt; had appeared on my radar recently and I was keen to check it out.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Current 22 - Session Analysis with DuckDB and Jupyter Notebook</title>
      <link>https://rmoff.net/2022/10/14/current-22-session-analysis-with-duckdb-and-jupyter-notebook/</link>
      <pubDate>Fri, 14 Oct 2022 16:15:57 +0000</pubDate>
      <guid>https://rmoff.net/2022/10/14/current-22-session-analysis-with-duckdb-and-jupyter-notebook/</guid>
      <description>&lt;p&gt;At &lt;a href=&#34;https://2022.currentevent.io/&#34;&gt;Current 2022&lt;/a&gt; the audience was given the option to submit ratings. Here&amp;rsquo;s some analysis I&amp;rsquo;ve done on the raw data. It&amp;rsquo;s interesting to poke about in it, and it also gave me an excuse to try using &lt;a href=&#34;https://duckdb.org/docs/guides/python/jupyter&#34;&gt;DuckDB in a notebook&lt;/a&gt;!&lt;/p&gt;</description>
    </item>
    <item>
      <title>Data Engineering in 2022: Architectures &amp; Terminology</title>
      <link>https://rmoff.net/2022/10/02/data-engineering-in-2022-architectures-terminology/</link>
      <pubDate>Sun, 02 Oct 2022 10:50:56 +0000</pubDate>
      <guid>https://rmoff.net/2022/10/02/data-engineering-in-2022-architectures-terminology/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;This is one of those &lt;em&gt;you had to be there&lt;/em&gt; moments. If you come into the world of data and analytics engineering today, ELT is just what it is and is pretty much universally understood. But if you’ve been around for …&lt;em&gt;waves hands&lt;/em&gt;… longer than that, you might be confused by what people are calling ELT and ETL. Well, I was ✋.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Current 2022 - 5k Fun Run</title>
      <link>https://rmoff.net/2022/09/26/current-2022-5k-fun-run/</link>
      <pubDate>Mon, 26 Sep 2022 13:51:42 +0000</pubDate>
      <guid>https://rmoff.net/2022/09/26/current-2022-5k-fun-run/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;At &lt;a href=&#34;https://2022.currentevent.io/&#34;&gt;Current 22&lt;/a&gt; a few of us will be going for an early run on Tuesday morning. Everyone is very welcome!&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Data Engineering in 2022: Exploring LakeFS with Jupyter and PySpark</title>
      <link>https://rmoff.net/2022/09/16/data-engineering-in-2022-exploring-lakefs-with-jupyter-and-pyspark/</link>
      <pubDate>Fri, 16 Sep 2022 08:54:45 +0000</pubDate>
      <guid>https://rmoff.net/2022/09/16/data-engineering-in-2022-exploring-lakefs-with-jupyter-and-pyspark/</guid>
      <description>&lt;p&gt;With my &lt;a href=&#34;https://rmoff.net/2022/09/14/stretching-my-legs-in-the-data-engineering-ecosystem-in-2022/&#34;&gt;foray&lt;/a&gt; into the current world of data engineering I wanted to get my hands dirty with some of the tools and technologies I&amp;rsquo;d been reading about. The vehicle for this was trying to understand more about LakeFS, but along the way dabbling with PySpark and S3 (MinIO) too.&lt;/p&gt;&#xA;&lt;p&gt;I&amp;rsquo;d forgotten how amazingly useful notebooks are. It&amp;rsquo;s &lt;a href=&#34;https://www.rittmanmead.com/blog/2016/12/etl-offload-with-spark-and-amazon-emr-part-2-code-development-with-notebooks-and-docker/&#34;&gt;six years since I wrote about them last&lt;/a&gt; (and the last time I tried my hand at PySpark). This blog is basically the notebook, with some more annotations.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Data Engineering: Resources</title>
      <link>https://rmoff.net/2022/09/14/data-engineering-resources/</link>
      <pubDate>Wed, 14 Sep 2022 20:57:21 +0000</pubDate>
      <guid>https://rmoff.net/2022/09/14/data-engineering-resources/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;As I’ve been reading and exploring the current world of data engineering I’ve been adding links to my &lt;a href=&#34;https://raindrop.io/rmoff/data-engineering-23335742&#34;&gt;Raindrop.io collection&lt;/a&gt;, so check that out. In addition, below are some specific resources that I’d recommend.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Data Engineering in 2022: Storage and Access</title>
      <link>https://rmoff.net/2022/09/14/data-engineering-in-2022-storage-and-access/</link>
      <pubDate>Wed, 14 Sep 2022 17:07:04 +0000</pubDate>
      <guid>https://rmoff.net/2022/09/14/data-engineering-in-2022-storage-and-access/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;In this article I look at where we store our analytical data, how we organise it, and how we enable access to it. I’m considering here potentially large volumes of data for access throughout an organisation. I’m not looking at data stores that are used for specific purposes (caches, low-latency analytics, graph etc).&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;The article is &lt;a href=&#34;https://rmoff.net/2022/09/14/stretching-my-legs-in-the-data-engineering-ecosystem-in-2022/&#34;&gt;part of a series&lt;/a&gt; in which I explore the world of data engineering in 2022 and how it has changed from when I started my career in data warehousing 20+ years ago. Read the &lt;a href=&#34;https://rmoff.net/2022/09/14/stretching-my-legs-in-the-data-engineering-ecosystem-in-2022/&#34;&gt;introduction&lt;/a&gt; for more context and background.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Stretching my Legs in the Data Engineering Ecosystem in 2022</title>
      <link>https://rmoff.net/2022/09/14/stretching-my-legs-in-the-data-engineering-ecosystem-in-2022/</link>
      <pubDate>Wed, 14 Sep 2022 10:42:30 +0000</pubDate>
      <guid>https://rmoff.net/2022/09/14/stretching-my-legs-in-the-data-engineering-ecosystem-in-2022/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;For the past 5.5 years I’ve been head-down in the exciting area of stream processing and events, and I realised recently that the world of data and analytics that I worked in up to 2017, which was changing significantly back then (Big Data, y’all!), has evolved and, dare I say it, matured somewhat - and I’ve not necessarily kept up with it. In this series of posts you can follow along as I start to reacquaint myself with where it’s got to these days.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Customising the fields shown in Airtable&#39;s Calendar .ics export</title>
      <link>https://rmoff.net/2022/09/12/customising-the-fields-shown-in-airtables-calendar-.ics-export/</link>
      <pubDate>Mon, 12 Sep 2022 16:17:08 +0000</pubDate>
      <guid>https://rmoff.net/2022/09/12/customising-the-fields-shown-in-airtables-calendar-.ics-export/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://airtable.com&#34;&gt;Airtable&lt;/a&gt; is a rather wonderful tool. It powers &lt;a href=&#34;https://rmoff.net/2022/08/31/inside-the-sausage-factory-how-we-built-the-program-for-current-2022/&#34;&gt;the program creation backend process&lt;/a&gt; for Kafka Summit and Current. It does, however, have a few frustrating limitations - often where it feels like a feature was built on a Friday afternoon and they didn’t get chance to finish it before knocking off to head to the pub.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Inside the Sausage Factory: How we Built the Program for Current 2022</title>
      <link>https://rmoff.net/2022/08/31/inside-the-sausage-factory-how-we-built-the-program-for-current-2022/</link>
      <pubDate>Wed, 31 Aug 2022 12:14:09 +0000</pubDate>
      <guid>https://rmoff.net/2022/08/31/inside-the-sausage-factory-how-we-built-the-program-for-current-2022/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;If you’ve ever been to a conference, particularly as a speaker who’s submitted a paper that may or may not have been accepted, you might wonder quite how conferences choose the talks that get accepted.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I had the privilege of chairing the program committee for Current and Kafka Summit this year and curating the final program for both. Here’s a glimpse behind the curtains of how we built the program for Current 2022. It was originally posted as a &lt;a href=&#34;https://twitter.com/rmoff/status/1549410161688813569&#34;&gt;thread on Twitter&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>⚡️ Writing an abstract for a lightning talk ⚡️</title>
      <link>https://rmoff.net/2022/08/31/%EF%B8%8F-writing-an-abstract-for-a-lightning-talk-%EF%B8%8F/</link>
      <pubDate>Wed, 31 Aug 2022 11:13:38 +0000</pubDate>
      <guid>https://rmoff.net/2022/08/31/%EF%B8%8F-writing-an-abstract-for-a-lightning-talk-%EF%B8%8F/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;(&lt;a href=&#34;https://twitter.com/rmoff/status/1544257707049467905&#34;&gt;src&lt;/a&gt;)&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Lightning talks are generally 5-10 minutes. As the name implies - they are quick!&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;A good lightning talk is not just your breakout talk condensed into a shorter time frame. You can’t simply deliver the same material faster, or the same material at a higher level, or the same material with a few bits left out.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>How to Write a Good Tech Conference Abstract - Learn from the Mistakes of Others</title>
      <link>https://rmoff.net/2022/07/20/how-to-write-a-good-tech-conference-abstract-learn-from-the-mistakes-of-others/</link>
      <pubDate>Wed, 20 Jul 2022 08:58:38 +0000</pubDate>
      <guid>https://rmoff.net/2022/07/20/how-to-write-a-good-tech-conference-abstract-learn-from-the-mistakes-of-others/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Building the program for any conference is not an easy task. There will always be a speaker disappointed that their talk didn’t get in—or perhaps an audience who are disappointed that a particular talk &lt;em&gt;did&lt;/em&gt; get in. As the chair of the &lt;a href=&#34;https://www.confluent.io/en-gb/blog/introducing-current-2022-program-committee/&#34;&gt;program committee&lt;/a&gt; for &lt;a href=&#34;https://2022.currentevent.io/&#34;&gt;Current 22&lt;/a&gt;, one of the things that I’ve found really useful in building out the program this time round is the comments that the program committee left against submissions as they reviewed them.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;There were some common patterns I saw, and I thought it would be useful to share these here. Perhaps you’re an aspiring conference speaker looking to understand what mistakes to avoid. Maybe you’re an existing speaker whose abstracts don’t get accepted as often as you’d like. Or perhaps you’re just curious as to what goes on behind the curtains :)&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Remote-First Developer Advocacy</title>
      <link>https://rmoff.net/2022/04/07/remote-first-developer-advocacy/</link>
      <pubDate>Thu, 07 Apr 2022 21:19:48 +0000</pubDate>
      <guid>https://rmoff.net/2022/04/07/remote-first-developer-advocacy/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I’m convinced that a developer advocate &lt;em&gt;can&lt;/em&gt; be effective remotely. As a profession, we’ve all spent two years figuring out how to do just that. Some of it worked out great. Some of it, less so.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I &lt;a href=&#34;https://rmoff.net/2022/04/07/hanging-up-my-boarding-passes-and-jetlagfor-now/&#34;&gt;made the decision&lt;/a&gt; during COVID to stop travelling as part of my role as a developer advocate. In this article, I talk about my experience with different areas of advocacy done remotely.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Hanging up my Boarding Passes and Jetlag…for now</title>
      <link>https://rmoff.net/2022/04/07/hanging-up-my-boarding-passes-and-jetlagfor-now/</link>
      <pubDate>Thu, 07 Apr 2022 20:58:33 +0000</pubDate>
      <guid>https://rmoff.net/2022/04/07/hanging-up-my-boarding-passes-and-jetlagfor-now/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I recently started writing an abstract for a conference later this year and realised that I’m not even sure if I want to do it. Not the conference—it’s a great one—but just the whole up on stage doing a talk thing. I can’t work out if this is just nerves from the amount of time off the stage, or something more fundamental to deal with.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Using GitHub Actions to build automagic Hugo previews of draft articles</title>
      <link>https://rmoff.net/2022/04/06/using-github-actions-to-build-automagic-hugo-previews-of-draft-articles/</link>
      <pubDate>Wed, 06 Apr 2022 19:30:13 +0000</pubDate>
      <guid>https://rmoff.net/2022/04/06/using-github-actions-to-build-automagic-hugo-previews-of-draft-articles/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;This blog is written in Asciidoc, built using Hugo, and hosted on GitHub Pages. I recently wanted to share the draft of a post I was writing with someone and ended up exporting a local preview to a PDF - not a great workflow! This blog post shows you how to create an automagic hosted preview of any draft content on Hugo using GitHub Actions.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;This is useful for previewing and sharing one’s own content, but also for making good use of GitHub as a collaborative platform - if someone &lt;a href=&#34;https://github.com/rmoff/rmoff-blog/pull/4#pullrequestreview-933907051&#34;&gt;reviews and amends your PR&lt;/a&gt; the post gets updated in the preview too.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>🏃🚶 The unofficial Kafka Summit London 2022  Run/Walk 🏃🚶</title>
      <link>https://rmoff.net/2022/04/05/the-unofficial-kafka-summit-london-2022-run/walk/</link>
      <pubDate>Tue, 05 Apr 2022 09:54:05 +0000</pubDate>
      <guid>https://rmoff.net/2022/04/05/the-unofficial-kafka-summit-london-2022-run/walk/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;strong&gt;&lt;a href=&#34;https://www.myeventi.events/kafka22/gb/&#34;&gt;Kafka Summit London&lt;/a&gt; IS BACK&lt;/strong&gt;! After COVID spoiled everyone’s fun and fundamentally screwed everything up for the past two years, I cannot wait to be back at an in-person conference. At the last Kafka Summit in the beforetimes (San Francisco, 2019) &lt;a href=&#34;https://twitter.com/rmoff/status/1179047181891883008&#34;&gt;some of us&lt;/a&gt; got together for a &lt;a href=&#34;https://rmoff.net/2019/09/23/kafka-summit-goldengate-bridge-run/walk/&#34;&gt;run (or walk) across the GoldenGate bridge&lt;/a&gt;. I can’t promise quite the same views, but I thought it would be fun to do something similar when we meet in London later this month.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>My Favourite Tools on the Mac (Setting up a new Mac)</title>
      <link>https://rmoff.net/2021/07/29/my-favourite-tools-on-the-mac-setting-up-a-new-mac/</link>
      <pubDate>Thu, 29 Jul 2021 22:00:08 +0100</pubDate>
      <guid>https://rmoff.net/2021/07/29/my-favourite-tools-on-the-mac-setting-up-a-new-mac/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;This is the software counterpart to &lt;a href=&#34;https://rmoff.net/2020/12/02/my-workstation-2020/&#34;&gt;my previous article&lt;/a&gt; in which I looked at my workstation’s hardware setup. Some of these are unique or best-of-breed, others may have been &lt;a href=&#34;https://www.economist.com/babbage/2012/07/13/youve-been-sherlocked&#34;&gt;sherlocked&lt;/a&gt; but I stick with them anyway :)&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Why I use Alfred App (and maybe you should too)</title>
      <link>https://rmoff.net/2021/07/29/why-i-use-alfred-app-and-maybe-you-should-too/</link>
      <pubDate>Thu, 29 Jul 2021 21:24:08 +0100</pubDate>
      <guid>https://rmoff.net/2021/07/29/why-i-use-alfred-app-and-maybe-you-should-too/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I’ve used &lt;a href=&#34;https://www.alfredapp.com/&#34;&gt;Alfred&lt;/a&gt; for years, and it’s one of the first apps I’ll install on a fresh Mac. It’s like the Cmd-Space search integration that MacOS has, but &lt;strong&gt;so much more than that&lt;/strong&gt;. Here are a few of the really powerful features that make it the first app I’ll reach for on any new Mac - and one without which it’ll feel like I’m working with one arm tied behind my back :)&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>A bash script to deploy ksqlDB queries automagically</title>
      <link>https://rmoff.net/2021/04/01/a-bash-script-to-deploy-ksqldb-queries-automagically/</link>
      <pubDate>Thu, 01 Apr 2021 23:06:22 +0100</pubDate>
      <guid>https://rmoff.net/2021/04/01/a-bash-script-to-deploy-ksqldb-queries-automagically/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;There’s &lt;a href=&#34;https://github.com/spena/ksql/blob/7bc5875896c0206574e096c0ead808b5a87caa89/design-proposals/klip-42-schema-migrations-tool.md&#34;&gt;a bunch of improvements&lt;/a&gt; in the works for how ksqlDB handles code deployments and migrations. For now though, for deploying queries there’s the option of using &lt;a href=&#34;https://docs.ksqldb.io/en/latest/operate-and-deploy/installation/server-config/#non-interactive-headless-ksqldb-usage&#34;&gt;headless mode&lt;/a&gt; (which is limited to one query file and disables subsequent interactive work on the server from a CLI), manually running commands (yuck), or using the REST endpoint to deploy queries automagically. Here’s an example of doing that.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Loading CSV data into Confluent Cloud using the FilePulse connector</title>
      <link>https://rmoff.net/2021/03/26/loading-csv-data-into-confluent-cloud-using-the-filepulse-connector/</link>
      <pubDate>Fri, 26 Mar 2021 17:25:22 +0000</pubDate>
      <guid>https://rmoff.net/2021/03/26/loading-csv-data-into-confluent-cloud-using-the-filepulse-connector/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;The &lt;a href=&#34;https://www.confluent.io/hub/streamthoughts/kafka-connect-file-pulse?utm_source=rmoff&amp;amp;utm_medium=blog&amp;amp;utm_campaign=tm.devx_ch.rmoff_csv-to-ccloud.adoc&amp;amp;utm_term=rmoff-devx&#34;&gt;FilePulse connector&lt;/a&gt; from &lt;a href=&#34;https://twitter.com/fhussonnois&#34;&gt;Florian Hussonnois&lt;/a&gt; is a really useful connector for Kafka Connect which enables you to ingest flat files including CSV, JSON, XML, etc into Kafka. You can read more about it in &lt;a href=&#34;https://streamthoughts.github.io/kafka-connect-file-pulse/docs/overview/filepulse/&#34;&gt;its overview here&lt;/a&gt;. Other connectors for ingesting CSV data include &lt;a href=&#34;https://www.confluent.io/hub/jcustenborder/kafka-connect-spooldir?utm_source=rmoff&amp;amp;utm_medium=blog&amp;amp;utm_campaign=tm.devx_ch.rmoff_csv-to-ccloud.adoc&amp;amp;utm_term=rmoff-devx&#34;&gt;kafka-connect-spooldir&lt;/a&gt; (which I &lt;a href=&#34;https://rmoff.net/2020/06/17/loading-csv-data-into-kafka/&#34;&gt;wrote about previously&lt;/a&gt;), and &lt;a href=&#34;https://www.confluent.io/hub/mmolimar/kafka-connect-fs?utm_source=rmoff&amp;amp;utm_medium=blog&amp;amp;utm_campaign=tm.devx_ch.rmoff_csv-to-ccloud.adoc&amp;amp;utm_term=rmoff-devx&#34;&gt;kafka-connect-fs&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Here I’ll show how to use it to stream CSV data into a topic in &lt;a href=&#34;https://www.confluent.io/confluent-cloud/tryfree?utm_source=rmoff&amp;amp;utm_medium=blog&amp;amp;utm_campaign=tm.devx_ch.rmoff_csv-to-ccloud.adoc&amp;amp;utm_term=rmoff-devx&#34;&gt;Confluent Cloud&lt;/a&gt;. You can apply the same config pattern to any other secured Kafka cluster.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Connecting to managed ksqlDB in Confluent Cloud with REST and ksqlDB CLI</title>
      <link>https://rmoff.net/2021/03/24/connecting-to-managed-ksqldb-in-confluent-cloud-with-rest-and-ksqldb-cli/</link>
      <pubDate>Wed, 24 Mar 2021 09:36:43 +0000</pubDate>
      <guid>https://rmoff.net/2021/03/24/connecting-to-managed-ksqldb-in-confluent-cloud-with-rest-and-ksqldb-cli/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Using ksqlDB in &lt;a href=&#34;https://www.confluent.io/confluent-cloud/tryfree?utm_source=rmoff&amp;amp;utm_medium=blog&amp;amp;utm_campaign=tm.devx_ch.rmoff_ksqldb-local-to-cloud&amp;amp;utm_term=rmoff-devx&#34;&gt;Confluent Cloud&lt;/a&gt; makes things a whole bunch easier because now you just get to build apps and streaming pipelines, instead of having to run and manage a bunch of infrastructure yourself.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Once you’ve got ksqlDB provisioned on Confluent Cloud you can use the web-based editor to build and run queries. You can also connect to it using the &lt;a href=&#34;https://docs.ksqldb.io/en/latest/developer-guide/ksqldb-rest-api/?utm_source=rmoff&amp;amp;utm_medium=blog&amp;amp;utm_campaign=tm.devx_ch.rmoff_ksqldb-local-to-cloud&amp;amp;utm_term=rmoff-devx&#34;&gt;REST API&lt;/a&gt; and the ksqlDB CLI tool. Here’s how.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Using ksqlDB to process data ingested from ActiveMQ with Kafka Connect</title>
      <link>https://rmoff.net/2021/03/19/using-ksqldb-to-process-data-ingested-from-activemq-with-kafka-connect/</link>
      <pubDate>Fri, 19 Mar 2021 10:30:47 +0000</pubDate>
      <guid>https://rmoff.net/2021/03/19/using-ksqldb-to-process-data-ingested-from-activemq-with-kafka-connect/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;The ActiveMQ source connector creates a &lt;a href=&#34;https://docs.confluent.io/kafka-connect-activemq-source/current/index.html#io-confluent-connect-jms-value&#34;&gt;Struct holding the value&lt;/a&gt; of the message from ActiveMQ (as well as its &lt;a href=&#34;https://docs.confluent.io/kafka-connect-activemq-source/current/index.html#io-confluent-connect-jms-key&#34;&gt;key&lt;/a&gt;). This is as would be expected. However, you can encounter &lt;em&gt;challenges&lt;/em&gt; in working with the data if the ActiveMQ data of interest within the payload is complex. Things like converters and schemas can get really funky, really quick.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Kafka Connect JDBC Sink deep-dive: Working with Primary Keys</title>
      <link>https://rmoff.net/2021/03/12/kafka-connect-jdbc-sink-deep-dive-working-with-primary-keys/</link>
      <pubDate>Fri, 12 Mar 2021 12:16:16 +0000</pubDate>
      <guid>https://rmoff.net/2021/03/12/kafka-connect-jdbc-sink-deep-dive-working-with-primary-keys/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;The Kafka Connect JDBC Sink can be used to stream data from a Kafka topic to a database such as Oracle, Postgres, MySQL, DB2, etc.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;It supports many permutations of configuration around how &lt;strong&gt;primary keys&lt;/strong&gt; are handled. The &lt;a href=&#34;https://docs.confluent.io/kafka-connect-jdbc/current/sink-connector/sink_config_options.html#data-mapping?utm_source=rmoff&amp;amp;utm_medium=blog&amp;amp;utm_campaign=tm.devx_ch.rmoff_jdbc-sink-primary-keys&amp;amp;utm_term=rmoff-devx&#34;&gt;documentation&lt;/a&gt; details these. This article aims to illustrate and expand on this.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Kafka Connect - SQLSyntaxErrorException: BLOB/TEXT column … used in key specification without a key length</title>
      <link>https://rmoff.net/2021/03/11/kafka-connect-sqlsyntaxerrorexception-blob/text-column-used-in-key-specification-without-a-key-length/</link>
      <pubDate>Thu, 11 Mar 2021 11:25:57 +0000</pubDate>
      <guid>https://rmoff.net/2021/03/11/kafka-connect-sqlsyntaxerrorexception-blob/text-column-used-in-key-specification-without-a-key-length/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I got the error &lt;code&gt;SQLSyntaxErrorException: BLOB/TEXT column &amp;#39;MESSAGE_KEY&amp;#39; used in key specification without a key length&lt;/code&gt; with the &lt;a href=&#34;https://docs.confluent.io/current/connect/kafka-connect-jdbc/sink-connector/index.html&#34;&gt;Kafka Connect JDBC Sink connector&lt;/a&gt; (v10.0.2) and MySQL (8.0.23).&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Quick profiling of data in Apache Kafka using kafkacat and visidata</title>
      <link>https://rmoff.net/2021/03/04/quick-profiling-of-data-in-apache-kafka-using-kafkacat-and-visidata/</link>
      <pubDate>Thu, 04 Mar 2021 14:23:15 +0000</pubDate>
      <guid>https://rmoff.net/2021/03/04/quick-profiling-of-data-in-apache-kafka-using-kafkacat-and-visidata/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;ksqlDB is a fantastically powerful tool for processing and analysing streams of data in Apache Kafka. But sometimes, you just want a quick way to profile the data in a topic in Kafka. I &lt;a href=&#34;https://rmoff.net/2021/02/02/performing-a-group-by-on-data-in-bash/&#34;&gt;wrote about this previously&lt;/a&gt; with a convoluted (but effective) set of bash commands pipelined together to perform a &lt;code&gt;GROUP BY&lt;/code&gt; on data. Then someone introduced me to &lt;code&gt;visidata&lt;/code&gt;, which makes it all a lot quicker!&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Using Open Sea Map data in Kibana maps</title>
      <link>https://rmoff.net/2021/03/04/using-open-sea-map-data-in-kibana-maps/</link>
      <pubDate>Thu, 04 Mar 2021 09:23:05 +0000</pubDate>
      <guid>https://rmoff.net/2021/03/04/using-open-sea-map-data-in-kibana-maps/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Kibana’s map functionality is a powerful way to visualise data that has a location element in it. I was recently working with data about ships at sea, and whilst the built-in &lt;code&gt;Road map&lt;/code&gt; is very good, it doesn’t show much maritime detail.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;imageblock&#34;&gt;&#xA;&lt;div class=&#34;content&#34;&gt;&#xA;&lt;img src=&#34;https://rmoff.net/images/2021/03/maps01.png&#34; alt=&#34;maps01&#34;/&gt;&#xA;&lt;/div&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Kibana’s map visualisation has the option to pull in additional visual information from other places (known as tile servers). I found &lt;a href=&#34;https://wiki.openstreetmap.org/wiki/Tile_servers&#34;&gt;a list of Tile servers&lt;/a&gt;, which had details of &lt;a href=&#34;https://wiki.openstreetmap.org/wiki/OpenSeaMap&#34;&gt;OpenSeaMap&lt;/a&gt;, which includes:&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Loading delimited data into Kafka - quick &amp; dirty (but effective)</title>
      <link>https://rmoff.net/2021/02/26/loading-delimited-data-into-kafka-quick-dirty-but-effective/</link>
      <pubDate>Fri, 26 Feb 2021 22:45:36 +0000</pubDate>
      <guid>https://rmoff.net/2021/02/26/loading-delimited-data-into-kafka-quick-dirty-but-effective/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Whilst Apache Kafka is an event streaming platform designed for, well, &lt;em&gt;streams&lt;/em&gt; of events, it’s perfectly valid to use it as a store of data which perhaps changes only occasionally (or even never). I’m thinking here of reference data (lookup data) that’s used to enrich regular streams of events.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;You might well get your reference data from a database where it resides and do so effectively &lt;a href=&#34;https://rmoff.dev/no-more-silos&#34;&gt;using CDC&lt;/a&gt; - but sometimes it comes down to those pesky CSV files that we all know and love/hate. Simple, awful, but effective. I wrote previously about &lt;a href=&#34;https://rmoff.net/2020/06/17/loading-csv-data-into-kafka/&#34;&gt;loading CSV data into Kafka from files that are updated frequently&lt;/a&gt;, but here I want to look at CSV files that are not changing. Kafka Connect simplifies getting data in to (and out of) Kafka but even Kafka Connect becomes a bit of an overhead when you just have a single file that you want to load into a topic and then never deal with again. I spent this afternoon wrangling with a couple of CSV-ish files, and building on my previous article about &lt;a href=&#34;https://rmoff.net/2021/02/02/performing-a-group-by-on-data-in-bash/&#34;&gt;neat tricks you can do in bash with data&lt;/a&gt;, I have some more to share with you here :)&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>📼 ksqlDB HOWTO - A mini video series 📼</title>
      <link>https://rmoff.net/2021/02/17/ksqldb-howto-a-mini-video-series/</link>
      <pubDate>Wed, 17 Feb 2021 23:12:33 +0000</pubDate>
      <guid>https://rmoff.net/2021/02/17/ksqldb-howto-a-mini-video-series/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Some people learn through doing - and for that there’s a bunch of good ksqlDB tutorials &lt;a href=&#34;https://docs.ksqldb.io/en/latest/tutorials/?utm_source=rmoff&amp;amp;utm_medium=blog&amp;amp;utm_campaign=tm.devx_ch.rmoff_ksqldb-howto&amp;amp;utm_term=rmoff-devx&#34;&gt;here&lt;/a&gt; and &lt;a href=&#34;https://kafka-tutorials.confluent.io/?utm_source=rmoff&amp;amp;utm_medium=blog&amp;amp;utm_campaign=tm.devx_ch.rmoff_ksqldb-howto&amp;amp;utm_term=rmoff-devx&#34;&gt;here&lt;/a&gt;. Others may prefer to watch and listen first, before getting hands on. And for that, I humbly offer you this little series of videos all about ksqlDB. They’re all based on a set of demo scripts that you can &lt;a href=&#34;https://github.com/confluentinc/demo-scene/blob/master/introduction-to-ksqldb/demo_introduction_to_ksqldb_02.adoc&#34;&gt;run for yourself and try out&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;🚨 Make sure you &lt;a href=&#34;http://youtube.com/rmoff?sub_confirmation=1&#34;&gt;subscribe to my YouTube channel&lt;/a&gt; so that you don’t miss more videos like these!&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Performing a GROUP BY on data in bash</title>
      <link>https://rmoff.net/2021/02/02/performing-a-group-by-on-data-in-bash/</link>
      <pubDate>Tue, 02 Feb 2021 17:23:21 +0000</pubDate>
      <guid>https://rmoff.net/2021/02/02/performing-a-group-by-on-data-in-bash/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;em&gt;One of the fun things about working with data over the years is learning how to use the tools of the day—but also learning to fall back on the tools that are always there for you - and one of those is bash and its wonderful library of shell tools.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;There’s an even better way than I’ve described here, and it’s called &lt;code&gt;visidata&lt;/code&gt;. &lt;a href=&#34;https://rmoff.net/2021/03/04/quick-profiling-of-data-in-apache-kafka-using-kafkacat-and-visidata/&#34;&gt;I’ve written about it more over here&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I’ve been playing around with a new data source recently, and needed to understand more about its structure. Within a single stream there were multiple message types.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Running as root on Docker images that don&#39;t use root</title>
      <link>https://rmoff.net/2021/01/13/running-as-root-on-docker-images-that-dont-use-root/</link>
      <pubDate>Wed, 13 Jan 2021 12:13:41 +0000</pubDate>
      <guid>https://rmoff.net/2021/01/13/running-as-root-on-docker-images-that-dont-use-root/</guid>
      <description>&lt;p&gt;tl;dr: specify the &lt;code&gt;--user root&lt;/code&gt; argument:&lt;/p&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-shell&#34; data-lang=&#34;shell&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;docker &lt;span style=&#34;color:#008000&#34;&gt;exec&lt;/span&gt; --interactive &lt;span style=&#34;color:#b62;font-weight:bold&#34;&gt;\&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;            --tty &lt;span style=&#34;color:#b62;font-weight:bold&#34;&gt;\&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;            --user root &lt;span style=&#34;color:#b62;font-weight:bold&#34;&gt;\&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;            --workdir / &lt;span style=&#34;color:#b62;font-weight:bold&#34;&gt;\&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;            container-name bash&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Running a self-managed Kafka Connect worker for Confluent Cloud</title>
      <link>https://rmoff.net/2021/01/11/running-a-self-managed-kafka-connect-worker-for-confluent-cloud/</link>
      <pubDate>Mon, 11 Jan 2021 17:02:03 +0000</pubDate>
      <guid>https://rmoff.net/2021/01/11/running-a-self-managed-kafka-connect-worker-for-confluent-cloud/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Confluent Cloud is not only a &lt;strong&gt;fully&lt;/strong&gt;-managed Apache Kafka service, but also provides important additional pieces for building applications and pipelines including &lt;a href=&#34;https://docs.confluent.io/cloud/current/connectors/index.html&#34;&gt;managed connectors&lt;/a&gt;, &lt;a href=&#34;https://docs.confluent.io/cloud/current/client-apps/schemas-manage.html&#34;&gt;Schema Registry&lt;/a&gt;, and &lt;a href=&#34;https://docs.confluent.io/cloud/current/ksqldb.html&#34;&gt;ksqlDB&lt;/a&gt;. Managed Connectors are run for you (hence, managed!) within Confluent Cloud - you just specify the technology to which you want to integrate in or out of Kafka and Confluent Cloud does the rest.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Creating topics with Kafka Connect</title>
      <link>https://rmoff.net/2021/01/06/creating-topics-with-kafka-connect/</link>
      <pubDate>Wed, 06 Jan 2021 12:18:51 +0000</pubDate>
      <guid>https://rmoff.net/2021/01/06/creating-topics-with-kafka-connect/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;When Kafka Connect ingests data from a source system into Kafka it writes it to a topic. If you have set &lt;code&gt;auto.create.topics.enable = true&lt;/code&gt; on your broker then the topic will be created when written to. If &lt;code&gt;auto.create.topics.enable = false&lt;/code&gt; (as it is on Confluent Cloud and many self-managed environments, for good reasons) then you can tell Kafka Connect to create those topics first. &lt;em&gt;This was added in Apache Kafka 2.6 (Confluent Platform 6.0) - prior to that you had to manually create the topics yourself otherwise the connector would fail.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Kafka Connect - Deep Dive into Single Message Transforms</title>
      <link>https://rmoff.net/2021/01/04/kafka-connect-deep-dive-into-single-message-transforms/</link>
      <pubDate>Mon, 04 Jan 2021 14:26:40 +0000</pubDate>
      <guid>https://rmoff.net/2021/01/04/kafka-connect-deep-dive-into-single-message-transforms/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://cwiki.apache.org/confluence/display/KAFKA/KIP-66%3A+Single+Message+Transforms+for+Kafka+Connect&#34;&gt;KIP-66&lt;/a&gt; was added in Apache Kafka 0.10.2 and brought new functionality called &lt;strong&gt;Single Message Transforms&lt;/strong&gt; (SMT). Using SMT you can modify the data and its characteristics as it passes through the Kafka Connect pipeline, without needing additional stream processors. For things like manipulating fields, changing topic names, conditionally dropping messages, and more, SMT are a perfect solution. If you get to things like aggregation, joining streams, and lookups then SMT may not be the best for you and you should head over to Kafka Streams or ksqlDB instead.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>🎄 Twelve Days of SMT 🎄 - Day 12: Community Transformations</title>
      <link>https://rmoff.net/2020/12/23/twelve-days-of-smt-day-12-community-transformations/</link>
      <pubDate>Wed, 23 Dec 2020 16:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2020/12/23/twelve-days-of-smt-day-12-community-transformations/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Apache Kafka ships with &lt;a href=&#34;https://kafka.apache.org/documentation/#connect_included_transformation&#34;&gt;many Single Message Transformations (SMT) included&lt;/a&gt; - but the great thing about it being an &lt;a href=&#34;https://kafka.apache.org/26/javadoc/org/apache/kafka/connect/transforms/Transformation.html&#34;&gt;open API&lt;/a&gt; is that people can, and do, write their own transformations. Many of these are shared with the wider community, and in this final installment of the series I’m going to look at some of the transformations written by Jeremy Custenborder and available in &lt;a href=&#34;https://jcustenborder.github.io/kafka-connect-documentation/projects/kafka-connect-transform-common&#34;&gt;&lt;code&gt;kafka-connect-transform-common&lt;/code&gt;&lt;/a&gt; which can be &lt;a href=&#34;https://www.confluent.io/hub/jcustenborder/kafka-connect-transform-common&#34;&gt;downloaded and installed from Confluent Hub&lt;/a&gt; (or built from &lt;a href=&#34;https://github.com/jcustenborder/kafka-connect-transform-common&#34;&gt;source&lt;/a&gt;, if you like that kind of thing). Also check out the XML transformation by the same author, which &lt;a href=&#34;https://rmoff.net/2020/10/01/ingesting-xml-data-into-kafka-option-2-kafka-connect-plus-single-message-transform/&#34;&gt;I’ve written about previously&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>🎄 Twelve Days of SMT 🎄 - Day 11: Predicate and Filter</title>
      <link>https://rmoff.net/2020/12/22/twelve-days-of-smt-day-11-predicate-and-filter/</link>
      <pubDate>Tue, 22 Dec 2020 16:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2020/12/22/twelve-days-of-smt-day-11-predicate-and-filter/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Apache Kafka 2.6 included &lt;a href=&#34;https://cwiki.apache.org/confluence/display/KAFKA/KIP-585%3A+Filter+and+Conditional+SMTs&#34;&gt;KIP-585&lt;/a&gt; which adds support for defining predicates against which transforms are conditionally executed, as well as a &lt;a href=&#34;https://docs.confluent.io/platform/current/connect/transforms/filter-ak.html&#34;&gt;&lt;code&gt;Filter&lt;/code&gt;&lt;/a&gt; Single Message Transform to drop messages - which in combination means that you can conditionally drop messages.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;As part of Apache Kafka, Kafka Connect ships with pre-built Single Message Transforms and Predicates, but you can also write your own. The API for each is documented: &lt;a href=&#34;https://kafka.apache.org/26/javadoc/org/apache/kafka/connect/transforms/Transformation.html&#34;&gt;&lt;code&gt;Transformation&lt;/code&gt;&lt;/a&gt; / &lt;a href=&#34;https://kafka.apache.org/26/javadoc/index.html?org/apache/kafka/connect/transforms/predicates/Predicate.html&#34;&gt;&lt;code&gt;Predicate&lt;/code&gt;&lt;/a&gt;. The predicates that ship with Apache Kafka are:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;ulist&#34;&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;&lt;code&gt;RecordIsTombstone&lt;/code&gt; - The value part of the message is null (denoting a tombstone message)&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;&lt;code&gt;HasHeaderKey&lt;/code&gt; - Matches if a header exists with the name given&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;&lt;code&gt;TopicNameMatches&lt;/code&gt; - Matches based on topic&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>🎄 Twelve Days of SMT 🎄 - Day 10: ReplaceField</title>
      <link>https://rmoff.net/2020/12/21/twelve-days-of-smt-day-10-replacefield/</link>
      <pubDate>Mon, 21 Dec 2020 16:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2020/12/21/twelve-days-of-smt-day-10-replacefield/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;The &lt;a href=&#34;https://docs.confluent.io/platform/current/connect/transforms/replacefield.html&#34;&gt;&lt;code&gt;ReplaceField&lt;/code&gt;&lt;/a&gt; Single Message Transform has three modes of operation on fields of data passing through Kafka Connect:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;ulist&#34;&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;Include &lt;strong&gt;only&lt;/strong&gt; the fields specified in the list (&lt;code&gt;include&lt;/code&gt;)&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;Include all fields &lt;strong&gt;except&lt;/strong&gt; the ones specified (&lt;code&gt;exclude&lt;/code&gt;)&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;Rename field(s) (&lt;code&gt;renames&lt;/code&gt;)&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Scheduling Hugo Builds on GitHub pages with GitHub Actions</title>
      <link>https://rmoff.net/2020/12/20/scheduling-hugo-builds-on-github-pages-with-github-actions/</link>
      <pubDate>Sun, 20 Dec 2020 23:45:03 +0000</pubDate>
      <guid>https://rmoff.net/2020/12/20/scheduling-hugo-builds-on-github-pages-with-github-actions/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Over the years I’ve used various blogging platforms; after a brief dalliance with Blogger I started for real with the near-inevitable Wordpress.com. From there I decided it would be fun to self-host using Ghost, and then almost &lt;a href=&#34;https://rmoff.net/2018/12/17/moving-from-ghost-to-hugo/&#34;&gt;exactly two years ago to the day&lt;/a&gt; decided it definitely was not fun to spend time patching and upgrading my blog platform instead of writing blog articles, so headed over to my current platform of choice: Hugo hosted on GitHub pages. This has worked extremely well for me during that time, doing everything I want from it until recently.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>🎄 Twelve Days of SMT 🎄 - Day 9: Cast</title>
      <link>https://rmoff.net/2020/12/18/twelve-days-of-smt-day-9-cast/</link>
      <pubDate>Fri, 18 Dec 2020 16:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2020/12/18/twelve-days-of-smt-day-9-cast/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;The &lt;a href=&#34;https://docs.confluent.io/platform/current/connect/transforms/cast.html&#34;&gt;&lt;code&gt;Cast&lt;/code&gt;&lt;/a&gt; Single Message Transform lets you change the data type of fields in a Kafka message, supporting numerics, string, and boolean.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>🎄 Twelve Days of SMT 🎄 - Day 8: TimestampConverter</title>
      <link>https://rmoff.net/2020/12/17/twelve-days-of-smt-day-8-timestampconverter/</link>
      <pubDate>Thu, 17 Dec 2020 16:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2020/12/17/twelve-days-of-smt-day-8-timestampconverter/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;The &lt;a href=&#34;https://docs.confluent.io/platform/current/connect/transforms/timestampconverter.html&#34;&gt;&lt;code&gt;TimestampConverter&lt;/code&gt;&lt;/a&gt; Single Message Transform lets you work with timestamp fields in Kafka messages. You can convert a string into a native &lt;a href=&#34;https://kafka.apache.org/26/javadoc/org/apache/kafka/connect/data/Timestamp.html&#34;&gt;Timestamp&lt;/a&gt; type (or &lt;a href=&#34;https://kafka.apache.org/26/javadoc/org/apache/kafka/connect/data/Date.html&#34;&gt;Date&lt;/a&gt; or &lt;a href=&#34;https://kafka.apache.org/26/javadoc/org/apache/kafka/connect/data/Time.html&#34;&gt;Time&lt;/a&gt;), as well as Unix epoch - and the same in reverse too.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;This is really useful to make sure that data ingested into Kafka is correctly stored as a Timestamp (if it is one), and also enables you to write a Timestamp out to a sink connector in a string format that you choose.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>🎄 Twelve Days of SMT 🎄 - Day 7: TimestampRouter</title>
      <link>https://rmoff.net/2020/12/16/twelve-days-of-smt-day-7-timestamprouter/</link>
      <pubDate>Wed, 16 Dec 2020 16:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2020/12/16/twelve-days-of-smt-day-7-timestamprouter/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Just like the &lt;a href=&#34;https://rmoff.net/2020/12/11/twelve-days-of-smt-day-4-regexrouter/&#34;&gt;&lt;code&gt;RegExRouter&lt;/code&gt;&lt;/a&gt;, the &lt;a href=&#34;https://docs.confluent.io/platform/current/connect/transforms/timestamprouter.html&#34;&gt;&lt;code&gt;TimestampRouter&lt;/code&gt;&lt;/a&gt; can be used to modify the topic name of messages as they pass through Kafka Connect. Since the topic name is usually the basis for the naming of the object to which messages are written in a sink connector, this is a great way to achieve time-based partitioning of those objects if required. For example, instead of streaming messages from Kafka to an Elasticsearch index called &lt;code&gt;cars&lt;/code&gt;, they can be routed to monthly indices e.g. &lt;code&gt;cars_2020-10&lt;/code&gt;, &lt;code&gt;cars_2020-11&lt;/code&gt;, &lt;code&gt;cars_2020-12&lt;/code&gt;, etc.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;The &lt;code&gt;TimestampRouter&lt;/code&gt; takes two arguments: the format of the final topic name to generate, and the format of the timestamp to put in the topic name (based on &lt;a href=&#34;https://docs.oracle.com/javase/8/docs/api/java/text/SimpleDateFormat.html&#34;&gt;&lt;code&gt;SimpleDateFormat&lt;/code&gt;&lt;/a&gt;).&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;listingblock&#34;&gt;&#xA;&lt;div class=&#34;content&#34;&gt;&#xA;&lt;pre class=&#34;rouge highlight&#34; style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;&lt;code data-lang=&#34;javascript&#34;&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;transforms&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;                                     &lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt; &lt;span style=&#34;color: 
#e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;addTimestampToTopic&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&#xA;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;transforms.addTimestampToTopic.type&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;            &lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;org.apache.kafka.connect.transforms.TimestampRouter&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&#xA;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;transforms.addTimestampToTopic.topic.format&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;    &lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;${topic}_${timestamp}&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&#xA;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;transforms.addTimestampToTopic.timestamp.format&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;YYYY-MM-dd&lt;/span&gt;&lt;span style=&#34;color: 
#e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;/div&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>🎄 Twelve Days of SMT 🎄 - Day 6: InsertField II</title>
      <link>https://rmoff.net/2020/12/15/twelve-days-of-smt-day-6-insertfield-ii/</link>
      <pubDate>Tue, 15 Dec 2020 16:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2020/12/15/twelve-days-of-smt-day-6-insertfield-ii/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;We kicked off this series by seeing on &lt;a href=&#34;https://rmoff.net/2020/12/08/twelve-days-of-smt-day-1-insertfield-timestamp/&#34;&gt;day 1&lt;/a&gt; how to use &lt;a href=&#34;https://docs.confluent.io/platform/current/connect/transforms/insertfield.html&#34;&gt;&lt;code&gt;InsertField&lt;/code&gt;&lt;/a&gt; to add in the timestamp to a message passing through the Kafka Connect sink connector. Today we’ll see how to use the same Single Message Transform to add in a static field value, as well as the name of the Kafka topic, partition, and offset from which the message has been read.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;listingblock&#34;&gt;&#xA;&lt;div class=&#34;content&#34;&gt;&#xA;&lt;pre class=&#34;rouge highlight&#34; style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;&lt;code data-lang=&#34;javascript&#34;&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;transforms&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;                                &lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;insertStaticField1&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&#xA;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;transforms.insertStaticField1.type&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;        &lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: 
#e6db74&#34;&gt;org.apache.kafka.connect.transforms.InsertField$Value&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&#xA;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;transforms.insertStaticField1.static.field&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;sourceSystem&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&#xA;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;transforms.insertStaticField1.static.value&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;NeverGonna&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;/div&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>🎄 Twelve Days of SMT 🎄 - Day 5: MaskField</title>
      <link>https://rmoff.net/2020/12/14/twelve-days-of-smt-day-5-maskfield/</link>
      <pubDate>Mon, 14 Dec 2020 16:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2020/12/14/twelve-days-of-smt-day-5-maskfield/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;If you want to mask fields of data as you ingest from a source into Kafka, or write to a sink from Kafka with Kafka Connect, the &lt;a href=&#34;https://docs.confluent.io/platform/current/connect/transforms/maskfield.html&#34;&gt;&lt;code&gt;MaskField&lt;/code&gt;&lt;/a&gt; Single Message Transform is perfect for you. It retains the field whilst replacing its value.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;To use the Single Message Transform you specify the field to mask, and its replacement value. To mask the contents of a field called &lt;code&gt;cc_num&lt;/code&gt; you would use:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;listingblock&#34;&gt;&#xA;&lt;div class=&#34;content&#34;&gt;&#xA;&lt;pre class=&#34;rouge highlight&#34; style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;&lt;code data-lang=&#34;javascript&#34;&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;transforms&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;                               &lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;maskCC&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&#xA;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;transforms.maskCC.type&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;                   &lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt; &lt;span style=&#34;color: 
#e6db74&#34;&gt;org.apache.kafka.connect.transforms.MaskField$Value&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&#xA;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;transforms.maskCC.fields&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;                 &lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;cc_num&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&#xA;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;transforms.maskCC.replacement&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;            &lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;****-****-****-****&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;/div&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>🎄 Twelve Days of SMT 🎄 - Day 4: RegExRouter</title>
      <link>https://rmoff.net/2020/12/11/twelve-days-of-smt-day-4-regexrouter/</link>
      <pubDate>Fri, 11 Dec 2020 16:40:18 +0000</pubDate>
      <guid>https://rmoff.net/2020/12/11/twelve-days-of-smt-day-4-regexrouter/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;If you want to change the topic name to which a source connector writes, or object name that’s created on a target by a sink connector, the &lt;a href=&#34;https://docs.confluent.io/platform/current/connect/transforms/regexrouter.html&#34;&gt;&lt;code&gt;RegExRouter&lt;/code&gt;&lt;/a&gt; is exactly what you need.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;To use the Single Message Transform you specify the pattern in the topic name to match, and its replacement. To drop a prefix of &lt;code&gt;test-&lt;/code&gt; from a topic you would use:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;listingblock&#34;&gt;&#xA;&lt;div class=&#34;content&#34;&gt;&#xA;&lt;pre class=&#34;rouge highlight&#34; style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;&lt;code data-lang=&#34;javascript&#34;&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;transforms&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;                             &lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;dropTopicPrefix&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&#xA;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;transforms.dropTopicPrefix.type&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;        &lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;org.apache.kafka.connect.transforms.RegexRouter&lt;/span&gt;&lt;span 
style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&#xA;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;transforms.dropTopicPrefix.regex&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;       &lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;test-(.*)&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&#xA;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;transforms.dropTopicPrefix.replacement&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt; &lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;$1&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;/div&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>🎄 Twelve Days of SMT 🎄 - Day 3: Flatten</title>
      <link>https://rmoff.net/2020/12/10/twelve-days-of-smt-day-3-flatten/</link>
      <pubDate>Thu, 10 Dec 2020 16:25:00 +0000</pubDate>
      <guid>https://rmoff.net/2020/12/10/twelve-days-of-smt-day-3-flatten/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;The &lt;a href=&#34;https://docs.confluent.io/platform/current/connect/transforms/flatten.html&#34;&gt;&lt;code&gt;Flatten&lt;/code&gt;&lt;/a&gt; Single Message Transform (SMT) is useful when you need to collapse a nested message down to a flat structure.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;To use the Single Message Transform you only need to reference it; there’s no additional configuration required:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;listingblock&#34;&gt;&#xA;&lt;div class=&#34;content&#34;&gt;&#xA;&lt;pre class=&#34;rouge highlight&#34; style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;&lt;code data-lang=&#34;javascript&#34;&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;transforms&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;                    &lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;flatten&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&#xA;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;transforms.flatten.type&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;       &lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;org.apache.kafka.connect.transforms.Flatten$Value&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;/div&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>🎄 Twelve Days of SMT 🎄 - Day 2: ValueToKey and ExtractField</title>
      <link>https://rmoff.net/2020/12/09/twelve-days-of-smt-day-2-valuetokey-and-extractfield/</link>
      <pubDate>Wed, 09 Dec 2020 20:00:18 +0000</pubDate>
      <guid>https://rmoff.net/2020/12/09/twelve-days-of-smt-day-2-valuetokey-and-extractfield/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Setting the key of a Kafka message is important as it ensures correct logical processing when consumed across multiple partitions, as well as being a requirement when joining to messages in other topics. When using Kafka Connect the connector may already set the key, which is great. If not, you can use these two Single Message Transforms (SMT) to set it as part of the pipeline based on a field in the value part of the message.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;To use the &lt;a href=&#34;https://docs.confluent.io/platform/current/connect/transforms/valuetokey.html&#34;&gt;&lt;code&gt;ValueToKey&lt;/code&gt;&lt;/a&gt; Single Message Transform specify the name of the field (&lt;code&gt;id&lt;/code&gt;) that you want to copy from the value to the key:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;listingblock&#34;&gt;&#xA;&lt;div class=&#34;content&#34;&gt;&#xA;&lt;pre class=&#34;rouge highlight&#34; style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;&lt;code data-lang=&#34;javascript&#34;&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;transforms&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;                    &lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;copyIdToKey&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&#xA;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;transforms.copyIdToKey.type&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;   &lt;span style=&#34;color: #f8f8f2;background-color: 
#49483e&#34;&gt;:&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;org.apache.kafka.connect.transforms.ValueToKey&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&#xA;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;transforms.copyIdToKey.fields&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt; &lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;id&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;/div&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>🎄 Twelve Days of SMT 🎄 - Day 1: InsertField (timestamp)</title>
      <link>https://rmoff.net/2020/12/08/twelve-days-of-smt-day-1-insertfield-timestamp/</link>
      <pubDate>Tue, 08 Dec 2020 22:23:18 +0000</pubDate>
      <guid>https://rmoff.net/2020/12/08/twelve-days-of-smt-day-1-insertfield-timestamp/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;You can use the &lt;a href=&#34;https://docs.confluent.io/platform/current/connect/transforms/insertfield.html&#34;&gt;&lt;code&gt;InsertField&lt;/code&gt;&lt;/a&gt; Single Message Transform (SMT) to add the message timestamp into each message that Kafka Connect sends to a sink.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;To use the Single Message Transform specify the name of the field (&lt;code&gt;timestamp.field&lt;/code&gt;) that you want to add to hold the message timestamp:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;listingblock&#34;&gt;&#xA;&lt;div class=&#34;content&#34;&gt;&#xA;&lt;pre class=&#34;rouge highlight&#34; style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;&lt;code data-lang=&#34;javascript&#34;&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;transforms&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;                         &lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;insertTS&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&#xA;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;transforms.insertTS.type&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;           &lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;org.apache.kafka.connect.transforms.InsertField$Value&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span 
style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&#xA;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;transforms.insertTS.timestamp.field&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;messageTS&lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;/div&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Life as a Developer Advocate, nine months into a pandemic</title>
      <link>https://rmoff.net/2020/12/03/life-as-a-developer-advocate-nine-months-into-a-pandemic/</link>
      <pubDate>Thu, 03 Dec 2020 22:15:59 +0100</pubDate>
      <guid>https://rmoff.net/2020/12/03/life-as-a-developer-advocate-nine-months-into-a-pandemic/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Back in March 2020 the western world came to somewhat of a juddering halt, thanks to COVID-19. No-one knew then what would happen, but there was the impression that whilst the next few months were a write-off for sure, &lt;em&gt;maybe&lt;/em&gt; things would pick up again later in the year.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;It’s now early December 2020, and nothing is picking up any time soon. Summer provided a respite from the high levels of infection and mortality (in the UK at least), but then numbers spiked again in many places around the world and what was punted down the river back in March is being firmly punted yet again now.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>My Workstation - 2020</title>
      <link>https://rmoff.net/2020/12/02/my-workstation-2020/</link>
      <pubDate>Wed, 02 Dec 2020 17:19:21 +0100</pubDate>
      <guid>https://rmoff.net/2020/12/02/my-workstation-2020/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;em&gt;Is a blog even a blog nowadays if it doesn’t include a &amp;#34;Here is my home office setup&amp;#34;?&lt;/em&gt;&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Thanks to conferences all being online, and thus my talks being delivered from my study—and &lt;a href=&#34;https://twitter.com/search?q=speakerselfie%20(from%3Armoff)&amp;amp;src=typed_query&amp;amp;f=live&#34;&gt;my habit of posting a #SpeakerSelfie&lt;/a&gt; each time I do a conference talk—I often get questions about my setup. Plus, I’m kinda pleased with it so I want to show it off too ;-)&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Keynote - Why is Replace Fonts greyed out?</title>
      <link>https://rmoff.net/2020/11/13/keynote-why-is-replace-fonts-greyed-out/</link>
      <pubDate>Fri, 13 Nov 2020 15:49:37 +0000</pubDate>
      <guid>https://rmoff.net/2020/11/13/keynote-why-is-replace-fonts-greyed-out/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Very short &amp;amp; sweet this post, but Google turned up nothing when I was stuck so hopefully I’ll save someone else some head scratching by sharing this.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;imageblock&#34;&gt;&#xA;&lt;div class=&#34;content&#34;&gt;&#xA;&lt;img src=&#34;https://rmoff.net/images/2020/11/keynote01.jpg&#34; alt=&#34;keynote01&#34;/&gt;&#xA;&lt;/div&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Kafka Connect, ksqlDB, and Kafka Tombstone messages</title>
      <link>https://rmoff.net/2020/11/03/kafka-connect-ksqldb-and-kafka-tombstone-messages/</link>
      <pubDate>Tue, 03 Nov 2020 17:14:33 +0000</pubDate>
      <guid>https://rmoff.net/2020/11/03/kafka-connect-ksqldb-and-kafka-tombstone-messages/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;As you may already realise, Kafka is not just a fancy message bus, or a pipe for big data. It’s an event streaming platform! If this is news to you, I’ll wait here whilst you &lt;a href=&#34;https://www.confluent.io/learn/kafka-tutorial/&#34;&gt;read this&lt;/a&gt; or &lt;a href=&#34;https://rmoff.dev/kafka101&#34;&gt;watch this&lt;/a&gt;…&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Streaming Geopoint data from Kafka to Elasticsearch</title>
      <link>https://rmoff.net/2020/11/03/streaming-geopoint-data-from-kafka-to-elasticsearch/</link>
      <pubDate>Tue, 03 Nov 2020 10:36:18 +0000</pubDate>
      <guid>https://rmoff.net/2020/11/03/streaming-geopoint-data-from-kafka-to-elasticsearch/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Streaming data from Kafka to Elasticsearch is easy with Kafka Connect - you can see how in this &lt;a href=&#34;https://rmoff.dev/kafka-elasticsearch&#34;&gt;tutorial&lt;/a&gt; and &lt;a href=&#34;https://rmoff.dev/kafka-elasticsearch-video&#34;&gt;video&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;One of the things that sometimes causes issues though is how to get location data correctly indexed into Elasticsearch as &lt;a href=&#34;https://www.elastic.co/guide/en/elasticsearch/reference/current/geo-point.html&#34;&gt;&lt;code&gt;geo_point&lt;/code&gt;&lt;/a&gt; fields to enable all that lovely location analysis. Unlike data types like dates and numerics, Elasticsearch’s &lt;a href=&#34;https://www.elastic.co/guide/en/elasticsearch/reference/current/dynamic-field-mapping.html&#34;&gt;Dynamic Field Mapping&lt;/a&gt; won’t automagically pick up &lt;code&gt;geo_point&lt;/code&gt; data, and so you have to do two things:&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>ksqlDB - How to model a variable number of fields in a nested value (`STRUCT`)</title>
      <link>https://rmoff.net/2020/10/07/ksqldb-how-to-model-a-variable-number-of-fields-in-a-nested-value-struct/</link>
      <pubDate>Wed, 07 Oct 2020 11:44:51 +0100</pubDate>
      <guid>https://rmoff.net/2020/10/07/ksqldb-how-to-model-a-variable-number-of-fields-in-a-nested-value-struct/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;There was a &lt;a href=&#34;https://stackoverflow.com/questions/64241285/kafka-topic-with-variable-nested-json-object-as-ksql-db-stream/64242383#64242383&#34;&gt;good question on StackOverflow&lt;/a&gt; recently in which someone was struggling to find the appropriate ksqlDB DDL to model a source topic in which there was a variable number of fields in a &lt;code&gt;STRUCT&lt;/code&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Streaming XML messages from IBM MQ into Kafka into MongoDB</title>
      <link>https://rmoff.net/2020/10/05/streaming-xml-messages-from-ibm-mq-into-kafka-into-mongodb/</link>
      <pubDate>Mon, 05 Oct 2020 10:09:41 +0100</pubDate>
      <guid>https://rmoff.net/2020/10/05/streaming-xml-messages-from-ibm-mq-into-kafka-into-mongodb/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Let’s imagine we have XML data on a queue in IBM MQ, and we want to ingest it into Kafka to then use downstream, perhaps in an application or maybe to stream to a NoSQL store like MongoDB.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;This same pattern for ingesting XML will work with other connectors such as &lt;a href=&#34;https://www.confluent.io/hub/confluentinc/kafka-connect-jms&#34;&gt;JMS&lt;/a&gt; and &lt;a href=&#34;https://www.confluent.io/hub/confluentinc/kafka-connect-activemq&#34;&gt;ActiveMQ&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Ingesting XML data into Kafka - Option 3: Kafka Connect FilePulse connector</title>
      <link>https://rmoff.net/2020/10/01/ingesting-xml-data-into-kafka-option-3-kafka-connect-filepulse-connector/</link>
      <pubDate>Thu, 01 Oct 2020 15:09:41 +0100</pubDate>
      <guid>https://rmoff.net/2020/10/01/ingesting-xml-data-into-kafka-option-3-kafka-connect-filepulse-connector/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;👉 &lt;em&gt;&lt;a href=&#34;https://rmoff.net/2020/10/01/ingesting-xml-data-into-kafka-introduction/&#34;&gt;Ingesting XML data into Kafka - Introduction&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;We saw in the &lt;a href=&#34;https://rmoff.net/2020/10/01/ingesting-xml-data-into-kafka-option-1-the-dirty-hack/&#34;&gt;first post&lt;/a&gt; how to hack together an ingestion pipeline for XML into Kafka using a source such as &lt;code&gt;curl&lt;/code&gt; piped through &lt;code&gt;xq&lt;/code&gt; to wrangle the XML and stream it into Kafka using &lt;code&gt;kafkacat&lt;/code&gt;, optionally using ksqlDB to apply and register a schema for it.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;The &lt;a href=&#34;https://rmoff.net/2020/10/01/ingesting-xml-data-into-kafka-option-2-kafka-connect-plus-single-message-transform/&#34;&gt;second one&lt;/a&gt; showed the use of any Kafka Connect source connector plus the &lt;code&gt;kafka-connect-transform-xml&lt;/code&gt; Single Message Transformation. Now we’re going to take a look at a source connector from the community that can also be used to ingest XML data into Kafka.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Ingesting XML data into Kafka - Option 2: Kafka Connect plus Single Message Transform</title>
      <link>https://rmoff.net/2020/10/01/ingesting-xml-data-into-kafka-option-2-kafka-connect-plus-single-message-transform/</link>
      <pubDate>Thu, 01 Oct 2020 14:09:41 +0100</pubDate>
      <guid>https://rmoff.net/2020/10/01/ingesting-xml-data-into-kafka-option-2-kafka-connect-plus-single-message-transform/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;We previously looked at the background to &lt;a href=&#34;https://rmoff.net/2020/10/01/ingesting-xml-data-into-kafka-introduction/&#34;&gt;getting XML into Kafka&lt;/a&gt;, and potentially &lt;a href=&#34;https://rmoff.net/2020/10/01/ingesting-xml-data-into-kafka-option-1-the-dirty-hack/&#34;&gt;how [not] to do it&lt;/a&gt;. Now let’s look at the &lt;em&gt;proper&lt;/em&gt; way to build a streaming ingestion pipeline for XML into Kafka, using Kafka Connect.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;If you’re unfamiliar with Kafka Connect, check out this &lt;a href=&#34;https://rmoff.dev/what-is-kafka-connect&#34;&gt;quick intro to Kafka Connect&lt;/a&gt;. Kafka Connect’s excellent pluggable architecture means that we can pair any &lt;strong&gt;source connector&lt;/strong&gt; to read XML from wherever we have it (for example, a flat file, or an MQ, or anywhere else), with a &lt;strong&gt;Single Message Transform&lt;/strong&gt; to transform the XML into a payload with a schema, and finally a &lt;strong&gt;converter&lt;/strong&gt; to serialise the data in a form that we would like to use such as Avro or Protobuf.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Ingesting XML data into Kafka - Option 1: The Dirty Hack</title>
      <link>https://rmoff.net/2020/10/01/ingesting-xml-data-into-kafka-option-1-the-dirty-hack/</link>
      <pubDate>Thu, 01 Oct 2020 13:09:41 +0100</pubDate>
      <guid>https://rmoff.net/2020/10/01/ingesting-xml-data-into-kafka-option-1-the-dirty-hack/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;em&gt;👉 &lt;a href=&#34;https://rmoff.net/2020/10/01/ingesting-xml-data-into-kafka-introduction/&#34;&gt;Ingesting XML data into Kafka - Introduction&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;strong&gt;What would a blog post on &lt;code&gt;rmoff.net&lt;/code&gt; be if it didn’t include the dirty hack option? 😁&lt;/strong&gt;&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;em&gt;The secret to dirty hacks is that they are often rather effective and when needs must, they can suffice. If you’re prototyping and need to &lt;a href=&#34;https://www.urbandictionary.com/define.php?term=JFDI&#34;&gt;&lt;strong&gt;JFDI&lt;/strong&gt;&lt;/a&gt;, a dirty hack is just fine. If you’re looking for code to run in Production, then a dirty hack probably is not fine.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Ingesting XML data into Kafka - Introduction</title>
      <link>https://rmoff.net/2020/10/01/ingesting-xml-data-into-kafka-introduction/</link>
      <pubDate>Thu, 01 Oct 2020 12:09:41 +0100</pubDate>
      <guid>https://rmoff.net/2020/10/01/ingesting-xml-data-into-kafka-introduction/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;XML has been around for 20+ years, and whilst other ways of serialising our data have gained popularity in more recent times (such as JSON, Avro, and Protobuf), XML is not going away soon. Part of that is down to technical reasons (clearly defined and documented schemas), and part of it is simply down to enterprise inertia - having adopted XML for systems in the last couple of decades, they’re not going to be changing now just for some short-term fad. See also COBOL.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>`abcde` - Error trying to calculate disc ids without lead-out information</title>
      <link>https://rmoff.net/2020/10/01/abcde-error-trying-to-calculate-disc-ids-without-lead-out-information/</link>
      <pubDate>Thu, 01 Oct 2020 09:16:11 +0100</pubDate>
      <guid>https://rmoff.net/2020/10/01/abcde-error-trying-to-calculate-disc-ids-without-lead-out-information/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Short &amp;amp; sweet to help out future Googlers. Trying to use &lt;code&gt;abcde&lt;/code&gt; I got the error:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;listingblock&#34;&gt;&#xA;&lt;div class=&#34;content&#34;&gt;&#xA;&lt;pre class=&#34;rouge highlight&#34; style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;&lt;code data-lang=&#34;bash&#34;&gt;&lt;span style=&#34;color: #f92672;font-weight: bold&#34;&gt;[&lt;/span&gt;WARNING] something went wrong &lt;span style=&#34;color: #66d9ef;font-weight: bold&#34;&gt;while &lt;/span&gt;querying the CD... Maybe a DATA CD or the CD is not loaded?&#xA;&lt;span style=&#34;color: #f92672;font-weight: bold&#34;&gt;[&lt;/span&gt;WARNING] Error trying to calculate disc ids without lead-out information.&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;/div&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>IBM MQ on Docker - Channel was blocked</title>
      <link>https://rmoff.net/2020/10/01/ibm-mq-on-docker-channel-was-blocked/</link>
      <pubDate>Thu, 01 Oct 2020 01:09:41 +0100</pubDate>
      <guid>https://rmoff.net/2020/10/01/ibm-mq-on-docker-channel-was-blocked/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Running IBM MQ in a Docker container and the client connecting to it was throwing repeated &lt;code&gt;Channel was blocked&lt;/code&gt; errors.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Setting key value when piping from jq to kafkacat</title>
      <link>https://rmoff.net/2020/09/30/setting-key-value-when-piping-from-jq-to-kafkacat/</link>
      <pubDate>Wed, 30 Sep 2020 20:54:09 +0100</pubDate>
      <guid>https://rmoff.net/2020/09/30/setting-key-value-when-piping-from-jq-to-kafkacat/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;One of my favourite hacks for getting data into Kafka is using kafkacat and &lt;code&gt;stdin&lt;/code&gt;, often from &lt;code&gt;jq&lt;/code&gt;. You can see this in action with &lt;a href=&#34;https://rmoff.net/2020/03/11/streaming-wi-fi-trace-data-from-raspberry-pi-to-apache-kafka-with-confluent-cloud/&#34;&gt;Wi-Fi data&lt;/a&gt;, &lt;a href=&#34;https://rmoff.net/2020/01/21/monitoring-sonos-with-ksqldb-influxdb-and-grafana/&#34;&gt;IoT data&lt;/a&gt;, and data from a &lt;a href=&#34;https://rmoff.net/2018/05/10/quick-n-easy-population-of-realistic-test-data-into-kafka/&#34;&gt;REST endpoint&lt;/a&gt;. This is fine for getting values into a Kafka message - but Kafka messages are &lt;strong&gt;key&lt;/strong&gt;/value, and being able to specify a key can often be important.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Here’s a way to do that, using a separator and some &lt;code&gt;jq&lt;/code&gt; magic. Note that at the moment &lt;a href=&#34;https://github.com/edenhill/kafkacat/issues/140&#34;&gt;kafkacat only supports single byte separator characters&lt;/a&gt;, so you need to choose carefully. If you pick a separator that also appears in your data, it’s possibly going to have unintended consequences.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Some of my favourite public data sets</title>
      <link>https://rmoff.net/2020/09/25/some-of-my-favourite-public-data-sets/</link>
      <pubDate>Fri, 25 Sep 2020 12:09:41 +0100</pubDate>
      <guid>https://rmoff.net/2020/09/25/some-of-my-favourite-public-data-sets/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Readers of a certain age and RDBMS background will probably remember &lt;code&gt;northwind&lt;/code&gt;, or &lt;code&gt;HR&lt;/code&gt;, or &lt;code&gt;OE&lt;/code&gt; databases - or quite possibly not just remember them but still be using them. Hardcoded sample data is fine, and it’s great for repeatable tutorials and examples - but it’s boring as heck if you want to build an example with something that isn’t using the same data set for the 100th time.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>📌    🎁 A collection of Kafka-related talks 💝</title>
      <link>https://rmoff.net/2020/09/23/a-collection-of-kafka-related-talks/</link>
      <pubDate>Wed, 23 Sep 2020 15:00:05 +0100</pubDate>
      <guid>https://rmoff.net/2020/09/23/a-collection-of-kafka-related-talks/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Here’s a collection of Kafka-related talks, &lt;em&gt;just for you.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Each one has 🍿🎥 a recording, 📔 slides, and 👾 code to go and try out. &lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Using the Debezium MS SQL connector with ksqlDB embedded Kafka Connect</title>
      <link>https://rmoff.net/2020/09/18/using-the-debezium-ms-sql-connector-with-ksqldb-embedded-kafka-connect/</link>
      <pubDate>Fri, 18 Sep 2020 10:00:05 +0100</pubDate>
      <guid>https://rmoff.net/2020/09/18/using-the-debezium-ms-sql-connector-with-ksqldb-embedded-kafka-connect/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Prompted by &lt;a href=&#34;https://stackoverflow.com/questions/63946368/how-to-use-the-debezium-sql-server-connector-with-ksqldb-embedded-connect&#34;&gt;a question on StackOverflow&lt;/a&gt; I thought I’d take a quick look at setting up &lt;a href=&#34;https://ksqldb.io&#34;&gt;ksqlDB&lt;/a&gt; to ingest CDC events from Microsoft SQL Server using &lt;a href=&#34;https://debezium.io/&#34;&gt;Debezium&lt;/a&gt;. Some of this is based on my previous article, &lt;a href=&#34;https://rmoff.net/2019/11/20/streaming-data-from-sql-server-to-kafka-to-snowflake-with-kafka-connect/&#34;&gt;Streaming data from SQL Server to Kafka to Snowflake ❄️ with Kafka Connect&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;sect1&#34;&gt;&#xA;&lt;h2 id=&#34;_setting_up_the_docker_compose&#34;&gt;Setting up the Docker Compose&lt;/h2&gt;&#xA;&lt;div class=&#34;sectionbody&#34;&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I like standalone, repeatable, demo code. For that reason I love using Docker Compose and I embed everything in there - connector installation, the kitchen sink - the works.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Including content from external links with Asciidoc in Hugo</title>
      <link>https://rmoff.net/2020/09/18/including-content-from-external-links-with-asciidoc-in-hugo/</link>
      <pubDate>Fri, 18 Sep 2020 09:00:05 +0100</pubDate>
      <guid>https://rmoff.net/2020/09/18/including-content-from-external-links-with-asciidoc-in-hugo/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I use &lt;a href=&#34;https://gohugo.io/&#34;&gt;Hugo&lt;/a&gt; for my blog, hosted on GitHub pages. One of the reasons I’m really happy with it is that I can use Asciidoc to author my posts. I was writing a blog recently in which I wanted to include some code that’s hosted on GitHub. I could have copied &amp;amp; pasted it into the blog but that would be lame!&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;With Asciidoc you can use the &lt;code&gt;include::&lt;/code&gt; directive to include both local files:&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>What is Kafka Connect?</title>
      <link>https://rmoff.net/2020/09/11/what-is-kafka-connect/</link>
      <pubDate>Fri, 11 Sep 2020 16:00:05 +0100</pubDate>
      <guid>https://rmoff.net/2020/09/11/what-is-kafka-connect/</guid>
      <description>&lt;p&gt;Kafka Connect is the integration API for Apache Kafka. Check out this video for an overview of what Kafka Connect enables you to do, and how to do it.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Counting the number of messages in a Kafka topic</title>
      <link>https://rmoff.net/2020/09/08/counting-the-number-of-messages-in-a-kafka-topic/</link>
      <pubDate>Tue, 08 Sep 2020 10:00:05 +0100</pubDate>
      <guid>https://rmoff.net/2020/09/08/counting-the-number-of-messages-in-a-kafka-topic/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;There’s ways, and then there’s ways, to count the number of records/events/messages in a Kafka topic. Most of them are potentially inaccurate, or inefficient, or both. Here’s one that falls into the &lt;em&gt;potentially inefficient&lt;/em&gt; category, using &lt;code&gt;kafkacat&lt;/code&gt; to read all the messages and pipe to &lt;code&gt;wc&lt;/code&gt; which with the &lt;code&gt;-l&lt;/code&gt; will tell you how many lines there are, and since each message is a line, how many messages you have in the Kafka topic:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;listingblock&#34;&gt;&#xA;&lt;div class=&#34;content&#34;&gt;&#xA;&lt;pre class=&#34;rouge highlight&#34; style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;&lt;code data-lang=&#34;bash&#34;&gt;&lt;span style=&#34;color: #f8f8f2&#34;&gt;$ &lt;/span&gt;kafkacat &lt;span style=&#34;color: #f92672&#34;&gt;-b&lt;/span&gt; broker:29092 &lt;span style=&#34;color: #f92672&#34;&gt;-t&lt;/span&gt; mytestopic &lt;span style=&#34;color: #f92672&#34;&gt;-C&lt;/span&gt; &lt;span style=&#34;color: #f92672&#34;&gt;-e&lt;/span&gt; &lt;span style=&#34;color: #f92672&#34;&gt;-q&lt;/span&gt;| &lt;span style=&#34;color: #f8f8f2&#34;&gt;wc&lt;/span&gt; &lt;span style=&#34;color: #f92672&#34;&gt;-l&lt;/span&gt;&#xA;       3&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;/div&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Poking around the search engines in Google Chrome</title>
      <link>https://rmoff.net/2020/09/07/poking-around-the-search-engines-in-google-chrome/</link>
      <pubDate>Mon, 07 Sep 2020 23:00:05 +0100</pubDate>
      <guid>https://rmoff.net/2020/09/07/poking-around-the-search-engines-in-google-chrome/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Google Chrome automagically adds sites that you visit which support searching to a list of custom search engines. For each one you can set a keyword which activates it, so based on the above list if I want to search Amazon I can just type &lt;code&gt;a&lt;/code&gt; &lt;code&gt;&amp;lt;tab&amp;gt;&lt;/code&gt; and then my search term.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;imageblock&#34;&gt;&#xA;&lt;div class=&#34;content&#34;&gt;&#xA;&lt;img src=&#34;https://rmoff.net/images/2020/09/searchengines02.gif&#34; alt=&#34;searchengines02&#34;/&gt;&#xA;&lt;/div&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>🤖Building a Telegram bot with Apache Kafka, Go, and ksqlDB</title>
      <link>https://rmoff.net/2020/08/20/building-a-telegram-bot-with-apache-kafka-go-and-ksqldb/</link>
      <pubDate>Thu, 20 Aug 2020 10:00:05 +0100</pubDate>
      <guid>https://rmoff.net/2020/08/20/building-a-telegram-bot-with-apache-kafka-go-and-ksqldb/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I had the pleasure of presenting at &lt;a href=&#34;https://dataengconf.com.au/&#34;&gt;DataEngBytes&lt;/a&gt; recently, and am delighted to share with you the &lt;strong&gt;🗒️ slides, 👾 code, and 🎥 recording&lt;/strong&gt; of my ✨brand new talk✨:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://rmoff.dev/carpark-telegram-bot&#34;&gt;🤖Building a Telegram bot with Apache Kafka, Go, and ksqlDB&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Telegram bot - BOT_COMMAND_INVALID</title>
      <link>https://rmoff.net/2020/07/23/telegram-bot-bot_command_invalid/</link>
      <pubDate>Thu, 23 Jul 2020 15:00:05 +0100</pubDate>
      <guid>https://rmoff.net/2020/07/23/telegram-bot-bot_command_invalid/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;A tiny snippet since I wasted 10 minutes going around the houses on this one…&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;tl;dr: If you try to create a command that is &lt;strong&gt;not in lower case&lt;/strong&gt; (e.g. &lt;code&gt;Alert&lt;/code&gt; not &lt;code&gt;alert&lt;/code&gt;) then the &lt;code&gt;setMyCommands&lt;/code&gt; API will return &lt;code&gt;BOT_COMMAND_INVALID&lt;/code&gt;&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Learning Golang (some rough notes) - S02E09 - Processing chunked responses before EOF is reached</title>
      <link>https://rmoff.net/2020/07/23/learning-golang-some-rough-notes-s02e09-processing-chunked-responses-before-eof-is-reached/</link>
      <pubDate>Thu, 23 Jul 2020 10:00:05 +0100</pubDate>
      <guid>https://rmoff.net/2020/07/23/learning-golang-some-rough-notes-s02e09-processing-chunked-responses-before-eof-is-reached/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;The server sends &lt;code&gt;Transfer-Encoding: chunked&lt;/code&gt; data, and you want to work with the data &lt;strong&gt;as you get it&lt;/strong&gt;, instead of waiting for the server to finish, the EOF to fire, and &lt;em&gt;then&lt;/em&gt; process the data?&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Learning Golang (some rough notes) - S02E08 - Checking Kafka advertised.listeners with Go</title>
      <link>https://rmoff.net/2020/07/17/learning-golang-some-rough-notes-s02e08-checking-kafka-advertised.listeners-with-go/</link>
      <pubDate>Fri, 17 Jul 2020 17:00:05 +0100</pubDate>
      <guid>https://rmoff.net/2020/07/17/learning-golang-some-rough-notes-s02e08-checking-kafka-advertised.listeners-with-go/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;At the &lt;a href=&#34;https://rmoff.net/2020/06/25/learning-golang-some-rough-notes-s01e00/&#34;&gt;beginning of all this&lt;/a&gt; my aim was to learn something new (Go), and use it to write a version of a utility that I’d previously &lt;a href=&#34;https://github.com/rmoff/kafka-listeners/blob/master/python/python_kafka_test_client.py&#34;&gt;hacked together in Python&lt;/a&gt; that checks your Apache Kafka broker configuration for possible problems with the infamous &lt;code&gt;advertised.listeners&lt;/code&gt; setting. Check out a blog that I wrote which explains &lt;em&gt;&lt;a href=&#34;https://www.confluent.io/blog/kafka-client-cannot-connect-to-broker-on-aws-on-docker-etc&#34;&gt;all about Apache Kafka and listener configuration&lt;/a&gt;&lt;/em&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;You can find the code at &lt;a href=&#34;https://github.com/rmoff/kafka-listeners&#34; class=&#34;bare&#34;&gt;https://github.com/rmoff/kafka-listeners&lt;/a&gt;&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Learning Golang (some rough notes) - S02E07 - Splitting Go code into separate source files and building a binary executable</title>
      <link>https://rmoff.net/2020/07/16/learning-golang-some-rough-notes-s02e07-splitting-go-code-into-separate-source-files-and-building-a-binary-executable/</link>
      <pubDate>Thu, 16 Jul 2020 11:00:05 +0100</pubDate>
      <guid>https://rmoff.net/2020/07/16/learning-golang-some-rough-notes-s02e07-splitting-go-code-into-separate-source-files-and-building-a-binary-executable/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;So far I’ve been running all my code either in the &lt;a href=&#34;https://tour.golang.org/&#34;&gt;Go Tour sandbox&lt;/a&gt;, using &lt;a href=&#34;https://play.golang.org/&#34;&gt;Go Playground&lt;/a&gt;, or from a single file in VS Code. My explorations in the &lt;a href=&#34;https://rmoff.net/2020/07/15/learning-golang-some-rough-notes-s02e06-putting-the-producer-in-a-function-and-handling-errors-in-a-go-routine/&#34;&gt;previous article&lt;/a&gt; ended up with a source file that was starting to get a little bit unwieldy, so let’s take a look at how that can be improved.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Within my &lt;a href=&#34;https://rmoff.net/2020/07/15/learning-golang-some-rough-notes-s02e06-putting-the-producer-in-a-function-and-handling-errors-in-a-go-routine/&#34;&gt;most recent code&lt;/a&gt;, I have the &lt;code&gt;main&lt;/code&gt; function and the &lt;code&gt;doProduce&lt;/code&gt; function, which is fine when collapsed down:&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Learning Golang (some rough notes) - S02E06 - Putting the Producer in a function and handling errors in a Go routine</title>
      <link>https://rmoff.net/2020/07/15/learning-golang-some-rough-notes-s02e06-putting-the-producer-in-a-function-and-handling-errors-in-a-go-routine/</link>
      <pubDate>Wed, 15 Jul 2020 14:00:05 +0100</pubDate>
      <guid>https://rmoff.net/2020/07/15/learning-golang-some-rough-notes-s02e06-putting-the-producer-in-a-function-and-handling-errors-in-a-go-routine/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;When I set out to &lt;a href=&#34;https://rmoff.net/2020/06/25/learning-golang-some-rough-notes-s01e00/&#34;&gt;learn Go&lt;/a&gt; one of the aims I had in mind was to write a version of &lt;a href=&#34;https://github.com/rmoff/kafka-listeners/blob/master/python/python_kafka_test_client.py&#34;&gt;this little Python utility&lt;/a&gt; which accompanies a blog I wrote recently about &lt;a href=&#34;https://www.confluent.io/blog/kafka-client-cannot-connect-to-broker-on-aws-on-docker-etc&#34;&gt;understanding and diagnosing problems with Kafka advertised listeners&lt;/a&gt;. Having successfully got &lt;a href=&#34;https://rmoff.net/2020/07/10/learning-golang-some-rough-notes-s02e02-adding-error-handling-to-the-producer/&#34;&gt;Producer&lt;/a&gt;, &lt;a href=&#34;https://rmoff.net/2020/07/14/learning-golang-some-rough-notes-s02e04-kafka-go-consumer-function-based/&#34;&gt;Consumer&lt;/a&gt;, and &lt;a href=&#34;https://rmoff.net/2020/07/15/learning-golang-some-rough-notes-s02e05-kafka-go-adminclient/&#34;&gt;AdminClient&lt;/a&gt; API examples working, it is now time to turn to that task.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Learning Golang (some rough notes) - S02E05 - Kafka Go AdminClient</title>
      <link>https://rmoff.net/2020/07/15/learning-golang-some-rough-notes-s02e05-kafka-go-adminclient/</link>
      <pubDate>Wed, 15 Jul 2020 11:00:05 +0100</pubDate>
      <guid>https://rmoff.net/2020/07/15/learning-golang-some-rough-notes-s02e05-kafka-go-adminclient/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Having ticked off the basics with an Apache Kafka &lt;a href=&#34;https://rmoff.net/2020/07/10/learning-golang-some-rough-notes-s02e02-adding-error-handling-to-the-producer/&#34;&gt;producer&lt;/a&gt; and &lt;a href=&#34;https://rmoff.net/2020/07/14/learning-golang-some-rough-notes-s02e03-kafka-go-consumer-channel-based/&#34;&gt;consumer&lt;/a&gt; in Go, let’s now check out the AdminClient. This is useful for checking out metadata about the cluster, creating topics, and stuff like that.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Learning Golang (some rough notes) - S02E04 - Kafka Go Consumer (Function-based)</title>
      <link>https://rmoff.net/2020/07/14/learning-golang-some-rough-notes-s02e04-kafka-go-consumer-function-based/</link>
      <pubDate>Tue, 14 Jul 2020 13:59:05 +0100</pubDate>
      <guid>https://rmoff.net/2020/07/14/learning-golang-some-rough-notes-s02e04-kafka-go-consumer-function-based/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Last time I looked at creating my &lt;a href=&#34;https://rmoff.net/2020/07/14/learning-golang-some-rough-notes-s02e03-kafka-go-consumer-channel-based/&#34;&gt;first Apache Kafka consumer in Go&lt;/a&gt;, which used the now-deprecated channel-based consumer. Whilst idiomatic for Go, it has some issues which mean that the function-based consumer is recommended for use instead. So let’s go and use it!&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Instead of reading from the &lt;code&gt;Events()&lt;/code&gt; channel of the consumer, we read events using the &lt;a href=&#34;https://docs.confluent.io/5.5.1/clients/confluent-kafka-go/index.html#Consumer.Poll&#34;&gt;&lt;code&gt;Poll()&lt;/code&gt;&lt;/a&gt; function with a timeout. The way we handle events (a &lt;code&gt;switch&lt;/code&gt; based on their &lt;code&gt;type&lt;/code&gt;) is the same:&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Learning Golang (some rough notes) - S02E03 - Kafka Go Consumer (Channel-based)</title>
      <link>https://rmoff.net/2020/07/14/learning-golang-some-rough-notes-s02e03-kafka-go-consumer-channel-based/</link>
      <pubDate>Tue, 14 Jul 2020 11:59:05 +0100</pubDate>
      <guid>https://rmoff.net/2020/07/14/learning-golang-some-rough-notes-s02e03-kafka-go-consumer-channel-based/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Having written my first &lt;a href=&#34;https://rmoff.net/2020/07/08/learning-golang-some-rough-notes-s02e01-my-first-kafka-go-producer/&#34;&gt;Kafka producer in Go&lt;/a&gt;, and even &lt;a href=&#34;https://rmoff.net/2020/07/10/learning-golang-some-rough-notes-s02e02-adding-error-handling-to-the-producer/&#34;&gt;added error handling to it&lt;/a&gt;, the next step was to write a consumer. It follows closely the pattern of &lt;a href=&#34;https://rmoff.net/2020/07/10/learning-golang-some-rough-notes-s02e02-adding-error-handling-to-the-producer/&#34;&gt;Producer code I finished up with previously&lt;/a&gt;, using the channel-based approach for the &lt;a href=&#34;https://docs.confluent.io/current/clients/confluent-kafka-go/index.html#Consumer&#34;&gt;Consumer&lt;/a&gt;:&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Learning Golang (some rough notes) - S02E02 - Adding error handling to the Producer</title>
      <link>https://rmoff.net/2020/07/10/learning-golang-some-rough-notes-s02e02-adding-error-handling-to-the-producer/</link>
      <pubDate>Fri, 10 Jul 2020 10:59:05 +0100</pubDate>
      <guid>https://rmoff.net/2020/07/10/learning-golang-some-rough-notes-s02e02-adding-error-handling-to-the-producer/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I looked &lt;a href=&#34;https://rmoff.net/2020/07/08/learning-golang-some-rough-notes-s02e01-my-first-kafka-go-producer/&#34;&gt;last time&lt;/a&gt; at the very bare basics of writing a Kafka producer using Go. It worked, but only with everything lined up and pointing the right way. There was no error handling of any sorts. Let’s see about fixing this now.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Learning Golang (some rough notes) - S02E01 - My First Kafka Go Producer</title>
      <link>https://rmoff.net/2020/07/08/learning-golang-some-rough-notes-s02e01-my-first-kafka-go-producer/</link>
      <pubDate>Wed, 08 Jul 2020 17:59:05 +0100</pubDate>
      <guid>https://rmoff.net/2020/07/08/learning-golang-some-rough-notes-s02e01-my-first-kafka-go-producer/</guid>
      <description></description>
    </item>
    <item>
      <title>Learning Golang (some rough notes) - S02E00 - Kafka and Go</title>
      <link>https://rmoff.net/2020/07/08/learning-golang-some-rough-notes-s02e00-kafka-and-go/</link>
      <pubDate>Wed, 08 Jul 2020 10:59:05 +0100</pubDate>
      <guid>https://rmoff.net/2020/07/08/learning-golang-some-rough-notes-s02e00-kafka-and-go/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;With the first leg of my journey with Go &lt;a href=&#34;https://rmoff.net/2020/07/03/learning-golang-some-rough-notes-s01e10-concurrency-web-crawler/&#34;&gt;done&lt;/a&gt; (starting from a &lt;a href=&#34;https://rmoff.net/2020/06/25/learning-golang-some-rough-notes-s01e00/&#34;&gt;&lt;em&gt;very&lt;/em&gt; rudimentary base&lt;/a&gt;), the next step for me was to bring it into my current area of interest and work - Apache Kafka.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Learning Golang (some rough notes) - S01E10 - Concurrency (Web Crawler)</title>
      <link>https://rmoff.net/2020/07/03/learning-golang-some-rough-notes-s01e10-concurrency-web-crawler/</link>
      <pubDate>Fri, 03 Jul 2020 16:59:05 +0100</pubDate>
      <guid>https://rmoff.net/2020/07/03/learning-golang-some-rough-notes-s01e10-concurrency-web-crawler/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;👉 &lt;a href=&#34;https://tour.golang.org/concurrency/9&#34;&gt;A Tour of Go : sync.Mutex&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;In the &lt;a href=&#34;https://rmoff.net/2020/07/02/learning-golang-some-rough-notes-s01e09-concurrency-channels-goroutines/&#34;&gt;previous exercise&lt;/a&gt; I felt my &lt;a href=&#34;https://rmoff.net/2020/06/25/learning-golang-some-rough-notes-s01e00/&#34;&gt;absence of a formal CompSci background&lt;/a&gt; with the introduction of Binary Sorted Trees, and now I am conscious of it again with learning about mutexes. I’d &lt;em&gt;heard&lt;/em&gt; of them before, mostly when Oracle performance folk were talking about wait types - TIL it stands for &lt;code&gt;mutual exclusion&lt;/code&gt;!&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Why JSON isn&#39;t the same as JSON Schema in Kafka Connect converters and ksqlDB (Viewing Kafka messages bytes as hex)</title>
      <link>https://rmoff.net/2020/07/03/why-json-isnt-the-same-as-json-schema-in-kafka-connect-converters-and-ksqldb-viewing-kafka-messages-bytes-as-hex/</link>
      <pubDate>Fri, 03 Jul 2020 08:16:36 +0100</pubDate>
      <guid>https://rmoff.net/2020/07/03/why-json-isnt-the-same-as-json-schema-in-kafka-connect-converters-and-ksqldb-viewing-kafka-messages-bytes-as-hex/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I’ve been playing around with the new SerDes (serialisers/deserialisers) that shipped with Confluent Platform 5.5 - &lt;a href=&#34;https://docs.confluent.io/current/schema-registry/serdes-develop/index.html&#34;&gt;Protobuf, and JSON Schema&lt;/a&gt; (these were added to the existing support for Avro). The serialisers (and associated &lt;a href=&#34;https://docs.confluent.io/current/schema-registry/connect.html&#34;&gt;Kafka Connect converters&lt;/a&gt;) take a payload and serialise it into bytes for sending to Kafka, and I was interested in what those bytes look like. For that I used my favourite Kafka swiss-army knife: kafkacat.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Learning Golang (some rough notes) - S01E09 - Concurrency (Channels, Goroutines)</title>
      <link>https://rmoff.net/2020/07/02/learning-golang-some-rough-notes-s01e09-concurrency-channels-goroutines/</link>
      <pubDate>Thu, 02 Jul 2020 16:59:05 +0100</pubDate>
      <guid>https://rmoff.net/2020/07/02/learning-golang-some-rough-notes-s01e09-concurrency-channels-goroutines/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://tour.golang.org/concurrency/1&#34;&gt;A Tour of Go : Goroutines&lt;/a&gt; was OK but as with some previous material I headed over to &lt;a href=&#34;https://gobyexample.com/goroutines&#34;&gt;Go by example&lt;/a&gt; for clearer explanations.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Learning Golang (some rough notes) - S01E08 - Images</title>
      <link>https://rmoff.net/2020/07/02/learning-golang-some-rough-notes-s01e08-images/</link>
      <pubDate>Thu, 02 Jul 2020 14:59:05 +0100</pubDate>
      <guid>https://rmoff.net/2020/07/02/learning-golang-some-rough-notes-s01e08-images/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;👉 &lt;a href=&#34;https://tour.golang.org/methods/25&#34;&gt;A Tour of Go : Exercise: Images&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;This is based on the Picture generator from the &lt;a href=&#34;https://rmoff.net/2020/06/25/learning-golang-some-rough-notes-s01e02-slices/&#34;&gt;Slices exercise&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Learning Golang (some rough notes) - S01E07 - Readers</title>
      <link>https://rmoff.net/2020/07/01/learning-golang-some-rough-notes-s01e07-readers/</link>
      <pubDate>Wed, 01 Jul 2020 15:59:05 +0100</pubDate>
      <guid>https://rmoff.net/2020/07/01/learning-golang-some-rough-notes-s01e07-readers/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;👉 &lt;a href=&#34;https://tour.golang.org/methods/21&#34;&gt;A Tour of Go : Readers&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I’m not intending to pick holes in the Tour…but it’s not helping itself ;-)&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;For an introductory text, it makes a ton of assumptions about the user. Here it introduces Readers, and the explanation is good—but the example code looks like this:&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Learning Golang (some rough notes) - S01E06 - Errors</title>
      <link>https://rmoff.net/2020/07/01/learning-golang-some-rough-notes-s01e06-errors/</link>
      <pubDate>Wed, 01 Jul 2020 10:59:05 +0100</pubDate>
      <guid>https://rmoff.net/2020/07/01/learning-golang-some-rough-notes-s01e06-errors/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;👉 &lt;a href=&#34;https://tour.golang.org/methods/20&#34;&gt;A Tour of Go : Exercise: Errors&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Like Interfaces, the Tour didn’t really do it for me on Errors either. Too abstract, and not enough explanation of the code examples for my liking. It also doesn’t cover the &lt;a href=&#34;https://golang.org/pkg/errors/&#34;&gt;&lt;code&gt;errors&lt;/code&gt;&lt;/a&gt; package which other tutorials do. I’m not clear if that’s because the errors package isn’t used much, or the Tour focusses only on teaching the raw basics.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Learning Golang (some rough notes) - S01E05 - Interfaces</title>
      <link>https://rmoff.net/2020/06/30/learning-golang-some-rough-notes-s01e05-interfaces/</link>
      <pubDate>Tue, 30 Jun 2020 16:59:05 +0100</pubDate>
      <guid>https://rmoff.net/2020/06/30/learning-golang-some-rough-notes-s01e05-interfaces/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;👉 &lt;a href=&#34;https://tour.golang.org/methods/9&#34;&gt;A Tour of Go : Interfaces&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;This page really threw me, for several reasons:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;ulist&#34;&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;The text notes that there’s an error (&lt;em&gt;so why don’t they fix it?&lt;/em&gt;)&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;The provided code doesn’t run (presumably because of the above error)&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;It’s not clear if this is a deliberate error to illustrate a point, or just a snafu&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Learning Golang (some rough notes) - S01E04 - Function Closures</title>
      <link>https://rmoff.net/2020/06/29/learning-golang-some-rough-notes-s01e04-function-closures/</link>
      <pubDate>Mon, 29 Jun 2020 14:59:05 +0100</pubDate>
      <guid>https://rmoff.net/2020/06/29/learning-golang-some-rough-notes-s01e04-function-closures/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;👉 &lt;a href=&#34;https://tour.golang.org/moretypes/25&#34;&gt;A Tour of Go : Function Closures&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;So far the Tour has been 🤔 and 🧐 and even 🤨 but function closures had me 🤯 …&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Each of the words on the page made sense but strung together in a sentence didn’t really make any sense to me.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Learning Golang (some rough notes) - S01E03 - Maps</title>
      <link>https://rmoff.net/2020/06/29/learning-golang-some-rough-notes-s01e03-maps/</link>
      <pubDate>Mon, 29 Jun 2020 13:59:05 +0100</pubDate>
      <guid>https://rmoff.net/2020/06/29/learning-golang-some-rough-notes-s01e03-maps/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;👉 &lt;a href=&#34;https://tour.golang.org/moretypes/23&#34;&gt;A Tour of Go : Exercise - Maps&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;quoteblock&#34;&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Implement WordCount&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;This is probably bread-and-butter for any seasoned programmer, but I enjoyed the simple process and satisfaction of breaking the problem down into steps to solve using what the tutorial had just covered. Sketching out the logic in pseudo-code first, I figured that I wanted to do this:&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Learning Golang (some rough notes) - S01E02 - Slices</title>
      <link>https://rmoff.net/2020/06/25/learning-golang-some-rough-notes-s01e02-slices/</link>
      <pubDate>Thu, 25 Jun 2020 11:20:23 +0100</pubDate>
      <guid>https://rmoff.net/2020/06/25/learning-golang-some-rough-notes-s01e02-slices/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;&lt;a href=&#34;https://rmoff.net/2020/06/25/learning-golang-some-rough-notes-s01e00/&#34;&gt;Learning Go : Background&lt;/a&gt;&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;hr/&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;👉 &lt;a href=&#34;https://tour.golang.org/moretypes/7&#34;&gt;A Tour of Go : Slices&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Slices made sense, until I got to &lt;a href=&#34;https://tour.golang.org/moretypes/11&#34;&gt;&lt;em&gt;Slice length and capacity&lt;/em&gt;&lt;/a&gt;. Two bits puzzled me in this code:&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Learning Golang (some rough notes) - S01E01 - Pointers</title>
      <link>https://rmoff.net/2020/06/25/learning-golang-some-rough-notes-s01e01-pointers/</link>
      <pubDate>Thu, 25 Jun 2020 11:15:23 +0100</pubDate>
      <guid>https://rmoff.net/2020/06/25/learning-golang-some-rough-notes-s01e01-pointers/</guid>
      <description>&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;&lt;a href=&#34;https://rmoff.net/2020/06/25/learning-golang-some-rough-notes-s01e00/&#34;&gt;Learning Go : Background&lt;/a&gt;&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;hr/&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;👉 &lt;a href=&#34;https://tour.golang.org/moretypes/1&#34;&gt;A Tour of Go : Pointers&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I’ve never used pointers before. Found plenty of good resources about &lt;strong&gt;what&lt;/strong&gt; they are, e.g.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;ulist&#34;&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://www.callicoder.com/golang-pointers/&#34; class=&#34;bare&#34;&gt;https://www.callicoder.com/golang-pointers/&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://dave.cheney.net/2017/04/26/understand-go-pointers-in-less-than-800-words-or-your-money-back&#34; class=&#34;bare&#34;&gt;https://dave.cheney.net/2017/04/26/understand-go-pointers-in-less-than-800-words-or-your-money-back&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;But &lt;strong&gt;why&lt;/strong&gt;? It’s like explaining patiently to someone that 2+2 = 4, without really explaining &lt;strong&gt;why&lt;/strong&gt; we would want to add two numbers together in the first place.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Learning Golang (some rough notes) - S01E00</title>
      <link>https://rmoff.net/2020/06/25/learning-golang-some-rough-notes-s01e00/</link>
      <pubDate>Thu, 25 Jun 2020 11:13:23 +0100</pubDate>
      <guid>https://rmoff.net/2020/06/25/learning-golang-some-rough-notes-s01e00/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;My background is not a traditional CompSci one. I studied Music at university, and managed to wangle my way into IT through various means, ending up doing what I do now with no formal training in coding, and a grab-bag of hacky programming attempts on my CV. My weapons of choice have been BBC Basic, VBA, ASP, and more recently some very unpythonic-Python. It’s got me by, but I figured recently I’d like to learn something new, and several people pointed to Go as a good option.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>How to install connector plugins in Kafka Connect</title>
      <link>https://rmoff.net/2020/06/19/how-to-install-connector-plugins-in-kafka-connect/</link>
      <pubDate>Fri, 19 Jun 2020 17:28:09 +0100</pubDate>
      <guid>https://rmoff.net/2020/06/19/how-to-install-connector-plugins-in-kafka-connect/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Kafka Connect (which is part of Apache Kafka) supports pluggable connectors, enabling you to stream data between Kafka and numerous types of system including, to mention just a few:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;ulist&#34;&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;Databases&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;Message Queues&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;Flat files&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;Object stores&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;The appropriate plugin for the technology which you want to integrate can be found on &lt;a href=&#34;https://www.confluent.io/hub/&#34;&gt;Confluent Hub&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Loading CSV data into Kafka</title>
      <link>https://rmoff.net/2020/06/17/loading-csv-data-into-kafka/</link>
      <pubDate>Wed, 17 Jun 2020 17:57:18 +0100</pubDate>
      <guid>https://rmoff.net/2020/06/17/loading-csv-data-into-kafka/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;For whatever reason, CSV still exists as a ubiquitous data interchange format. It doesn’t get much simpler: chuck some plaintext with fields separated by commas into a file and stick &lt;code&gt;.csv&lt;/code&gt; on the end. If you’re feeling helpful you can include a header row with field names in.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-csv&#34; data-lang=&#34;csv&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#ba2121&#34;&gt;order_id&lt;/span&gt;,&lt;span style=&#34;color:#ba2121&#34;&gt;customer_id&lt;/span&gt;,&lt;span style=&#34;color:#ba2121&#34;&gt;order_total_usd&lt;/span&gt;,&lt;span style=&#34;color:#ba2121&#34;&gt;make&lt;/span&gt;,&lt;span style=&#34;color:#ba2121&#34;&gt;model&lt;/span&gt;,&lt;span style=&#34;color:#ba2121&#34;&gt;delivery_city&lt;/span&gt;,&lt;span style=&#34;color:#ba2121&#34;&gt;delivery_company&lt;/span&gt;,&lt;span style=&#34;color:#ba2121&#34;&gt;delivery_address&lt;/span&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#ba2121&#34;&gt;1&lt;/span&gt;,&lt;span style=&#34;color:#ba2121&#34;&gt;535&lt;/span&gt;,&lt;span style=&#34;color:#ba2121&#34;&gt;190899.73&lt;/span&gt;,&lt;span style=&#34;color:#ba2121&#34;&gt;Dodge&lt;/span&gt;,&lt;span style=&#34;color:#ba2121&#34;&gt;Ram Wagon B350&lt;/span&gt;,&lt;span style=&#34;color:#ba2121&#34;&gt;Sheffield&lt;/span&gt;,&lt;span style=&#34;color:#ba2121&#34;&gt;DuBuque LLC&lt;/span&gt;,&lt;span style=&#34;color:#ba2121&#34;&gt;2810 Northland Avenue&lt;/span&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span 
style=&#34;color:#ba2121&#34;&gt;2&lt;/span&gt;,&lt;span style=&#34;color:#ba2121&#34;&gt;671&lt;/span&gt;,&lt;span style=&#34;color:#ba2121&#34;&gt;33245.53&lt;/span&gt;,&lt;span style=&#34;color:#ba2121&#34;&gt;Volkswagen&lt;/span&gt;,&lt;span style=&#34;color:#ba2121&#34;&gt;Cabriolet&lt;/span&gt;,&lt;span style=&#34;color:#ba2121&#34;&gt;Edinburgh&lt;/span&gt;,&lt;span style=&#34;color:#ba2121&#34;&gt;Bechtelar-VonRueden&lt;/span&gt;,&lt;span style=&#34;color:#ba2121&#34;&gt;1 Macpherson Crossing&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;In this article we’ll see how to load this CSV data into Kafka, without even needing to write any code.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>How to list and create Kafka topics using the REST Proxy API</title>
      <link>https://rmoff.net/2020/06/05/how-to-list-and-create-kafka-topics-using-the-rest-proxy-api/</link>
      <pubDate>Fri, 05 Jun 2020 09:46:06 +0100</pubDate>
      <guid>https://rmoff.net/2020/06/05/how-to-list-and-create-kafka-topics-using-the-rest-proxy-api/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;In v5.5 of Confluent Platform the REST Proxy added new Admin API capabilities, including functionality to list, and create, topics on your cluster.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Check out the &lt;a href=&#34;https://docs.confluent.io/current/kafka-rest/api.html#crest-api-v3&#34;&gt;docs here&lt;/a&gt; and &lt;a href=&#34;https://www.confluent.io/download/#confluent-platform&#34;&gt;download Confluent Platform&lt;/a&gt; here. The REST proxy is &lt;a href=&#34;https://www.confluent.io/confluent-community-license-faq/&#34;&gt;Confluent Community Licenced&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Working with JSON nested arrays in ksqlDB - example</title>
      <link>https://rmoff.net/2020/05/26/working-with-json-nested-arrays-in-ksqldb-example/</link>
      <pubDate>Tue, 26 May 2020 10:02:48 +0100</pubDate>
      <guid>https://rmoff.net/2020/05/26/working-with-json-nested-arrays-in-ksqldb-example/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Question from the Confluent Community Slack group:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;quoteblock&#34;&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;How can I access the data in object in an array like below using ksqlDB stream&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;listingblock&#34;&gt;&#xA;&lt;div class=&#34;content&#34;&gt;&#xA;&lt;pre class=&#34;rouge highlight&#34; style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;&lt;code data-lang=&#34;json&#34;&gt;&lt;span style=&#34;color: #f8f8f2&#34;&gt;&#xA;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;font-weight: bold&#34;&gt;&amp;#34;Total&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;[&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2&#34;&gt;&#xA;        &lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;{&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2&#34;&gt;&#xA;          &lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;font-weight: bold&#34;&gt;&amp;#34;TotalType&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;Standard&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2&#34;&gt;&#xA;          &lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;font-weight: bold&#34;&gt;&amp;#34;TotalAmount&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color: #ae81ff&#34;&gt;15.99&lt;/span&gt;&lt;span 
style=&#34;color: #f8f8f2&#34;&gt;&#xA;        &lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;},&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2&#34;&gt;&#xA;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;{&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2&#34;&gt;&#xA;          &lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;font-weight: bold&#34;&gt;&amp;#34;TotalType&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#34;Old Standard&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2&#34;&gt;&#xA;          &lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;font-weight: bold&#34;&gt;&amp;#34;TotalAmount&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color: #ae81ff&#34;&gt;16&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2&#34;&gt;&#xA;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;font-weight: bold&#34;&gt;&amp;#34; STID&amp;#34;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;:&lt;/span&gt;&lt;span style=&#34;color: #ae81ff&#34;&gt;56&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2&#34;&gt;&#xA;        &lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;}&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2&#34;&gt;&#xA;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;]&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;/div&gt;&#xA;&lt;/div&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Searching Alfred&#39;s Clipboard history programmatically</title>
      <link>https://rmoff.net/2020/05/18/searching-alfreds-clipboard-history-programatically/</link>
      <pubDate>Mon, 18 May 2020 12:46:02 +0100</pubDate>
      <guid>https://rmoff.net/2020/05/18/searching-alfreds-clipboard-history-programatically/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://www.alfredapp.com/&#34;&gt;Alfred&lt;/a&gt; is one of my favourite productivity apps for the Mac. It’s a file indexer, a clipboard manager, a snippet expander - and that’s just scratching the surface really. I recently got a new machine without it installed and realised &lt;em&gt;just how much&lt;/em&gt; I rely on Alfred, particularly its clipboard manager.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Building a Telegram bot with Apache Kafka and ksqlDB</title>
      <link>https://rmoff.net/2020/05/18/building-a-telegram-bot-with-apache-kafka-and-ksqldb/</link>
      <pubDate>Mon, 18 May 2020 11:28:15 +0100</pubDate>
      <guid>https://rmoff.net/2020/05/18/building-a-telegram-bot-with-apache-kafka-and-ksqldb/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Imagine you’ve got a stream of data; it’s not “big data,” but it’s certainly a lot. Within the data, you’ve got some bits you’re interested in, and of those bits, you’d like to be able to query information about them at any point. Sounds fun, right?&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;imageblock&#34;&gt;&#xA;&lt;div class=&#34;content&#34;&gt;&#xA;&lt;img src=&#34;https://rmoff.net/images/2020/05/telegram_arch02.png&#34; alt=&#34;Architecture high-level view&#34;/&gt;&#xA;&lt;/div&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;What if you didn’t need any datastore other than Apache Kafka itself to be able to do this? What if you could ingest, filter, enrich, aggregate, and query data with just Kafka? With ksqlDB we can do just this, and I want to show you exactly how, using a Telegram bot as the application looking up state from the inbound stream of events:&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Add Markers list from Screenflow to Youtube Table of Contents</title>
      <link>https://rmoff.net/2020/05/04/add-markers-list-from-screenflow-to-youtube-table-of-contents/</link>
      <pubDate>Mon, 04 May 2020 10:20:10 +0100</pubDate>
      <guid>https://rmoff.net/2020/05/04/add-markers-list-from-screenflow-to-youtube-table-of-contents/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Screenflow has a useful Markers feature for adding notes to the timeline.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;imageblock&#34;&gt;&#xA;&lt;div class=&#34;content&#34;&gt;&#xA;&lt;img src=&#34;https://rmoff.net/images/2020/05/subler01.png&#34; alt=&#34;subler01&#34;/&gt;&#xA;&lt;/div&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;You can use these to helpfully add a table of contents to your Youtube video, but unfortunately Screenflow doesn’t have the option to export them directly. Instead, use the free &lt;a href=&#34;https://bitbucket.org/galad87/subler/wiki/Home&#34;&gt;Subler&lt;/a&gt; program as an intermediary (download it from &lt;a href=&#34;https://bitbucket.org/galad87/subler/downloads/&#34;&gt;here&lt;/a&gt;).&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;olist arabic&#34;&gt;&#xA;&lt;ol class=&#34;arabic&#34;&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;Export from Screenflow with a chapters track&lt;/p&gt;&#xA;&lt;div class=&#34;imageblock&#34;&gt;&#xA;&lt;div class=&#34;content&#34;&gt;&#xA;&lt;img src=&#34;https://rmoff.net/images/2020/05/subler02.png&#34; alt=&#34;subler02&#34;/&gt;&#xA;&lt;/div&gt;&#xA;&lt;/div&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;Open the file in Subler and export to text file&lt;/p&gt;&#xA;&lt;div class=&#34;imageblock&#34;&gt;&#xA;&lt;div class=&#34;content&#34;&gt;&#xA;&lt;img src=&#34;https://rmoff.net/images/2020/05/subler03.png&#34; alt=&#34;subler03&#34;/&gt;&#xA;&lt;/div&gt;&#xA;&lt;/div&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ol&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;From there, tidy up the text file from the source&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Using Confluent Cloud when there is no Cloud (or internet)</title>
      <link>https://rmoff.net/2020/04/20/using-confluent-cloud-when-there-is-no-cloud-or-internet/</link>
      <pubDate>Mon, 20 Apr 2020 13:55:46 +0100</pubDate>
      <guid>https://rmoff.net/2020/04/20/using-confluent-cloud-when-there-is-no-cloud-or-internet/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://confluent.cloud/signup&#34;&gt;☁️Confluent Cloud&lt;/a&gt; is a great solution for a hosted and managed Apache Kafka service, with the additional benefits of Confluent Platform such as ksqlDB and managed Kafka Connect connectors. But as a developer, you won’t always have a reliable internet connection. Trains, planes, and automobiles—not to mention crappy hotel or conference Wi-Fi. Wouldn’t it be useful if you could have a replica of your Cloud data on your local machine? That just pulled down new data automagically, without needing to be restarted each time you got back on the network?&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>How to install kafkacat on Fedora</title>
      <link>https://rmoff.net/2020/04/20/how-to-install-kafkacat-on-fedora/</link>
      <pubDate>Mon, 20 Apr 2020 10:25:32 +0100</pubDate>
      <guid>https://rmoff.net/2020/04/20/how-to-install-kafkacat-on-fedora/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://github.com/edenhill/kafkacat&#34;&gt;kafkacat&lt;/a&gt; is one of my go-to tools when working with Kafka. It’s a producer and consumer, but also a swiss-army knife of debugging and troubleshooting capabilities. So when I built a new Fedora server recently, I needed to get it installed. Unfortunately there’s no pre-packed install available on &lt;code&gt;yum&lt;/code&gt;, so here’s how to do it manually.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;&lt;code&gt;kafkacat&lt;/code&gt; is now known as &lt;code&gt;kcat&lt;/code&gt; (&lt;a href=&#34;https://github.com/edenhill/kcat/pull/339&#34;&gt;ref&lt;/a&gt;). When invoking the command you will need to use &lt;strong&gt;&lt;code&gt;kcat&lt;/code&gt;&lt;/strong&gt; in place of &lt;code&gt;kafkacat&lt;/code&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Converting from AsciiDoc to Google Docs and MS Word</title>
      <link>https://rmoff.net/2020/04/16/converting-from-asciidoc-to-google-docs-and-ms-word/</link>
      <pubDate>Thu, 16 Apr 2020 14:27:50 +0100</pubDate>
      <guid>https://rmoff.net/2020/04/16/converting-from-asciidoc-to-google-docs-and-ms-word/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;em&gt;Updated 16 April 2020 to cover formatting tricks &amp;amp; add import to Google Docs info&lt;/em&gt;&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Short and sweet this one. I’ve written in the past how&#xA;&lt;a href=&#34;https://rmoff.net/2017/09/12/what-is-markdown-and-why-is-it-awesome/&#34;&gt;I&#xA;love Markdown&lt;/a&gt; but I’ve actually moved on from that and now firmly throw&#xA;my hat in the &lt;a href=&#34;http://www.methods.co.nz/asciidoc/&#34;&gt;AsciiDoc&lt;/a&gt; ring. I’ll&#xA;write another post another time explaining why in more detail, but in&#xA;short it’s just more powerful whilst still simple and readable without&#xA;compilation.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;So anyway, I use AsciiDoc (adoc) for all my technical (and often&#xA;non-technical) writing now, and from there usually dump it out to HTML&#xA;which I can share with people as needed:&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>A quick and dirty way to monitor data arriving on Kafka</title>
      <link>https://rmoff.net/2020/04/16/a-quick-and-dirty-way-to-monitor-data-arriving-on-kafka/</link>
      <pubDate>Thu, 16 Apr 2020 00:51:18 +0100</pubDate>
      <guid>https://rmoff.net/2020/04/16/a-quick-and-dirty-way-to-monitor-data-arriving-on-kafka/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I’ve been poking around recently with &lt;a href=&#34;https://rmoff.net/2020/03/11/streaming-wi-fi-trace-data-from-raspberry-pi-to-apache-kafka-with-confluent-cloud/&#34;&gt;capturing Wi-Fi packet data&lt;/a&gt; and streaming it into Apache Kafka, from where I’m processing and analysing it. Kafka itself is rock-solid - because I’m using &lt;a href=&#34;https://confluent.cloud/signup&#34;&gt;☁️Confluent Cloud&lt;/a&gt; and someone else worries about provisioning it, scaling it, and keeping it running for me. But whilst Kafka works just great, my side of the setup—&lt;code&gt;tshark&lt;/code&gt; running on a Raspberry Pi—is less than stable. For whatever reason it sometimes stalls and I have to restart the Raspberry Pi and restart the capture process.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Are Tech Conferences Dead?</title>
      <link>https://rmoff.net/2020/03/13/are-tech-conferences-dead/</link>
      <pubDate>Fri, 13 Mar 2020 22:19:16 +0000</pubDate>
      <guid>https://rmoff.net/2020/03/13/are-tech-conferences-dead/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;🦠COVID-19 has well and truly hit the tech scene this week. As well as being full of &amp;#34;WFH tips&amp;#34; for all the tech workers suddenly banished from their offices, my particular Twitter bubble is full of DevRel folk musing and debating about what this interruption means to our profession. For sure, in the short term, the Spring conference season is screwed— &lt;a href=&#34;https://airtable.com/shrETNURgXNrGWbd8/tblc49hMMykARebo8?blocks=hide&#34;&gt;&lt;strong&gt;all&lt;/strong&gt; the conferences&lt;/a&gt; are cancelled (or postponed).&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;But what about the future? No-one would ever &lt;em&gt;want&lt;/em&gt; to take such a forced hiatus but what an excellent opportunity it is to take a step back and consider why we’re doing what we’re doing - and if we should go back to business as usual once things calm down.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Streaming Wi-Fi trace data from Raspberry Pi to Apache Kafka with Confluent Cloud</title>
      <link>https://rmoff.net/2020/03/11/streaming-wi-fi-trace-data-from-raspberry-pi-to-apache-kafka-with-confluent-cloud/</link>
      <pubDate>Wed, 11 Mar 2020 11:58:13 +0000</pubDate>
      <guid>https://rmoff.net/2020/03/11/streaming-wi-fi-trace-data-from-raspberry-pi-to-apache-kafka-with-confluent-cloud/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Wi-Fi is now ubiquitous in most populated areas, and the way the devices communicate leaves a lot of &amp;#39;digital exhaust&amp;#39;. Usually a computer will have a Wi-Fi device that’s configured to connect to a given network, but often these devices can be configured instead to pick up the background Wi-Fi chatter of surrounding devices.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;There are good reasons—and bad—for doing this. Just like taking apart equipment to understand how it works teaches us things, so being able to &lt;a href=&#34;https://rmoff.net/2019/11/29/using-tcpdump-with-docker/&#34;&gt;dissect and examine protocol traffic&lt;/a&gt; lets us learn about this. However, by collecting this type of traffic it can be possible to track and analyse behaviour in ways that we may or may not feel comfortable with. &lt;a href=&#34;https://tfl.gov.uk/corporate/privacy-and-cookies/wi-fi-data-collection&#34;&gt;Improving public transport&lt;/a&gt;? Sure. &lt;a href=&#34;https://www.theguardian.com/technology/datablog/2014/jan/10/how-tracking-customers-in-store-will-soon-be-the-norm&#34;&gt;Tracking shopper behaviour&lt;/a&gt;? Meh, less sure.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Kafka Connect JDBC Sink - setting the key field name</title>
      <link>https://rmoff.net/2020/02/25/kafka-connect-jdbc-sink-setting-the-key-field-name/</link>
      <pubDate>Tue, 25 Feb 2020 14:37:12 +0100</pubDate>
      <guid>https://rmoff.net/2020/02/25/kafka-connect-jdbc-sink-setting-the-key-field-name/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I wanted to get some data from a Kafka topic:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-sql&#34; data-lang=&#34;sql&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;ksql&lt;span style=&#34;color:#666&#34;&gt;&amp;gt;&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;PRINT&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;PERSON_STATS&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color:#008000;font-weight:bold&#34;&gt;FROM&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;BEGINNING;&lt;span style=&#34;color:#bbb&#34;&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#008000;font-weight:bold&#34;&gt;Key&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;format:&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;KAFKA&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;(STRING)&lt;span style=&#34;color:#bbb&#34;&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;Value&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;format:&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;AVRO&lt;span style=&#34;color:#bbb&#34;&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;rowtime:&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color:#666&#34;&gt;2&lt;/span&gt;&lt;span style=&#34;color:#666&#34;&gt;/&lt;/span&gt;&lt;span style=&#34;color:#666&#34;&gt;25&lt;/span&gt;&lt;span style=&#34;color:#666&#34;&gt;/&lt;/span&gt;&lt;span style=&#34;color:#666&#34;&gt;20&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt; 
&lt;/span&gt;&lt;span style=&#34;color:#666&#34;&gt;1&lt;/span&gt;:&lt;span style=&#34;color:#666&#34;&gt;12&lt;/span&gt;:&lt;span style=&#34;color:#666&#34;&gt;51&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;PM&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;UTC,&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color:#008000;font-weight:bold&#34;&gt;key&lt;/span&gt;:&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;robin,&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;value:&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;&lt;span style=&#34;&#34;&gt;{&lt;/span&gt;&lt;span style=&#34;color:#ba2121&#34;&gt;&amp;#34;PERSON&amp;#34;&lt;/span&gt;:&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color:#ba2121&#34;&gt;&amp;#34;robin&amp;#34;&lt;/span&gt;,&lt;span style=&#34;color:#bbb&#34;&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color:#ba2121&#34;&gt;&amp;#34;LOCATION_CHANGES&amp;#34;&lt;/span&gt;:&lt;span style=&#34;color:#666&#34;&gt;1&lt;/span&gt;,&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color:#ba2121&#34;&gt;&amp;#34;UNIQUE_LOCATIONS&amp;#34;&lt;/span&gt;:&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color:#666&#34;&gt;1&lt;/span&gt;&lt;span style=&#34;&#34;&gt;}&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;into Postgres, so did the easy thing and used Kafka Connect with the &lt;a href=&#34;https://docs.confluent.io/current/connect/kafka-connect-jdbc/sink-connector/index.html&#34;&gt;JDBC Sink connector&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Adventures in the Cloud, Part 94: ECS</title>
      <link>https://rmoff.net/2020/02/13/adventures-in-the-cloud-part-94-ecs/</link>
      <pubDate>Thu, 13 Feb 2020 00:12:23 +0000</pubDate>
      <guid>https://rmoff.net/2020/02/13/adventures-in-the-cloud-part-94-ecs/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;My name’s Robin, and I’m a Developer Advocate. What that means in part is that I build a ton of demos, and Docker Compose is my jam. I love using Docker Compose for the same reasons that many people do:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;ulist&#34;&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;Spin up and tear down fully-functioning multi-component environments with ease. No bespoke builds, no cloning of VMs to preserve &amp;#34;that magic state where everything works&amp;#34;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;Repeatability. It’s the same each time.&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;Portability. I can point someone at a &lt;code&gt;docker-compose.yml&lt;/code&gt; that I’ve written and they can run the same on their machine with the same results almost guaranteed.&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Primitive Keys in ksqlDB</title>
      <link>https://rmoff.net/2020/02/07/primitive-keys-in-ksqldb/</link>
      <pubDate>Fri, 07 Feb 2020 10:58:06 +0000</pubDate>
      <guid>https://rmoff.net/2020/02/07/primitive-keys-in-ksqldb/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;ksqlDB 0.7 will add support for message keys as primitive data types beyond just &lt;code&gt;STRING&lt;/code&gt; (which is all we’ve had to date). That means that Kafka messages are going to be much easier to work with, and require less wrangling to get into the form in which you need them. Take an example of a database table that you’ve ingested into a Kafka topic, and want to join to a stream of events. Previously you’d have had to take the Kafka topic into which the table had been ingested and run a ksqlDB processor to re-key the messages such that ksqlDB could join on them. &lt;em&gt;Friends, I am here to tell you that this is no longer needed!&lt;/em&gt;&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Fantastical / Mac Calendar not showing Google Shared Calendar</title>
      <link>https://rmoff.net/2020/01/24/fantastical-/-mac-calendar-not-showing-google-shared-calendar/</link>
      <pubDate>Fri, 24 Jan 2020 11:50:01 +0000</pubDate>
      <guid>https://rmoff.net/2020/01/24/fantastical-/-mac-calendar-not-showing-google-shared-calendar/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Very simple to fix: go to &lt;a href=&#34;https://calendar.google.com/calendar/syncselect&#34; class=&#34;bare&#34;&gt;https://calendar.google.com/calendar/syncselect&lt;/a&gt; and select the calendars that you want. Click save.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Notes on getting data into InfluxDB from Kafka with Kafka Connect</title>
      <link>https://rmoff.net/2020/01/23/notes-on-getting-data-into-influxdb-from-kafka-with-kafka-connect/</link>
      <pubDate>Thu, 23 Jan 2020 12:01:35 +0000</pubDate>
      <guid>https://rmoff.net/2020/01/23/notes-on-getting-data-into-influxdb-from-kafka-with-kafka-connect/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;You can download the InfluxDB connector for Kafka Connect &lt;a href=&#34;https://www.confluent.io/hub/confluentinc/kafka-connect-influxdb&#34;&gt;here&lt;/a&gt;. Documentation for it is &lt;a href=&#34;https://docs.confluent.io/current/connect/kafka-connect-influxdb/influx-db-sink-connector/&#34;&gt;here&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;When a message from your source Kafka topic is written to InfluxDB the InfluxDB values are set thus:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;ulist&#34;&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;&lt;strong&gt;Timestamp&lt;/strong&gt; is taken from the Kafka message timestamp (which is either set by your producer, or the time at which it was received by the broker)&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;&lt;strong&gt;Tag(s)&lt;/strong&gt; are taken from the &lt;code&gt;tags&lt;/code&gt; field in the message. This field must be a &lt;code&gt;map&lt;/code&gt; type - see below&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;&lt;strong&gt;Value&lt;/strong&gt; fields are taken from the rest of the message, and must be numeric or boolean&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;&lt;strong&gt;Measurement name&lt;/strong&gt; can be specified as a field of the message, or hardcoded in the connector config.&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Kafka Connect and Schemas</title>
      <link>https://rmoff.net/2020/01/22/kafka-connect-and-schemas/</link>
      <pubDate>Wed, 22 Jan 2020 00:26:03 +0000</pubDate>
      <guid>https://rmoff.net/2020/01/22/kafka-connect-and-schemas/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Here’s a fun one that Kafka Connect can sometimes throw out:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-shell&#34; data-lang=&#34;shell&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;java.lang.ClassCastException: &#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;java.util.HashMap cannot be cast to org.apache.kafka.connect.data.Struct&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;HashMap? Struct? HUH?&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Monitoring Sonos with ksqlDB, InfluxDB, and Grafana</title>
      <link>https://rmoff.net/2020/01/21/monitoring-sonos-with-ksqldb-influxdb-and-grafana/</link>
      <pubDate>Tue, 21 Jan 2020 22:47:35 +0000</pubDate>
      <guid>https://rmoff.net/2020/01/21/monitoring-sonos-with-ksqldb-influxdb-and-grafana/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I’m quite a fan of Sonos audio equipment but recently had some trouble with some of the devices glitching and even cutting out whilst playing. Under the covers Sonos stuff is running Linux (of course) and exposes some diagnostics through a rudimentary frontend that you can access at &lt;code&gt;http://&amp;lt;sonos player IP&amp;gt;:1400/support/review&lt;/code&gt;:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;imageblock&#34;&gt;&#xA;&lt;div class=&#34;content&#34;&gt;&#xA;&lt;img src=&#34;https://rmoff.net/images/2020/01/sonos00.png&#34; alt=&#34;Sonos Network Matrix&#34;/&gt;&#xA;&lt;/div&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Whilst this gives you the current state, you can’t get historical data on it. It &lt;em&gt;felt&lt;/em&gt; like the problems were happening &amp;#34;all the time&amp;#34;, but &lt;strong&gt;were they actually&lt;/strong&gt;? For that, we need some cold, hard, data! Something like this, in fact:&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>UnsupportedClassVersionError: `&lt;x&gt;` has been compiled by a more recent version of the Java Runtime</title>
      <link>https://rmoff.net/2020/01/21/unsupportedclassversionerror-x-has-been-compiled-by-a-more-recent-version-of-the-java-runtime/</link>
      <pubDate>Tue, 21 Jan 2020 22:26:00 +0000</pubDate>
      <guid>https://rmoff.net/2020/01/21/unsupportedclassversionerror-x-has-been-compiled-by-a-more-recent-version-of-the-java-runtime/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;This article is just for Googlers and my future self encountering this error. Recently I was building a Docker image from the ksqlDB code base, and whilst it built successfully, the ksqlDB server process in the Docker container failed when instantiated with an &lt;code&gt;UnsupportedClassVersionError&lt;/code&gt;:&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Changing the Logging Level for Kafka Connect Dynamically</title>
      <link>https://rmoff.net/2020/01/16/changing-the-logging-level-for-kafka-connect-dynamically/</link>
      <pubDate>Thu, 16 Jan 2020 22:50:45 +0000</pubDate>
      <guid>https://rmoff.net/2020/01/16/changing-the-logging-level-for-kafka-connect-dynamically/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Logs are magical things. They tell us what an application is doing—or not doing. They help us debug problems. As it happens, they also underpin the &lt;a href=&#34;https://engineering.linkedin.com/distributed-systems/log-what-every-software-engineer-should-know-about-real-time-datas-unifying&#34;&gt;entire philosophy of Apache Kafka&lt;/a&gt;, but that’s a story for another day. Today we’re talking about logs written by Kafka Connect, and how we can change the amount of detail written.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;By default, Kafka Connect will write logs at &lt;code&gt;INFO&lt;/code&gt; and above. So when it starts up you’ll see the settings that it’s using, and any &lt;code&gt;WARN&lt;/code&gt; or &lt;code&gt;ERROR&lt;/code&gt; messages along the way - a missing configuration, a broken connector, and so on. If you want to peer under the covers of what’s happening, perhaps in a given connector, you’d want to see &lt;code&gt;DEBUG&lt;/code&gt; or even &lt;code&gt;TRACE&lt;/code&gt; messages too.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>How to win [or at least not suck] at the conference abstract submission game</title>
      <link>https://rmoff.net/2020/01/16/how-to-win-or-at-least-not-suck-at-the-conference-abstract-submission-game/</link>
      <pubDate>Thu, 16 Jan 2020 13:45:31 +0000</pubDate>
      <guid>https://rmoff.net/2020/01/16/how-to-win-or-at-least-not-suck-at-the-conference-abstract-submission-game/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Just over a year ago, I put together the crudely-titled &lt;a href=&#34;https://rmoff.net/2018/12/19/quick-thoughts-on-not-writing-a-crap-abstract/&#34;&gt;&amp;#34;Quick Thoughts on Not Writing a Crap Abstract&amp;#34;&lt;/a&gt; after reviewing a few dozen conference abstracts. This time around I’ve had the honour of being on a conference programme committee and with it the pleasure of reading 250+ abstracts—from which I have some more snarky words of wisdom to impart on the matter.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;sect1&#34;&gt;&#xA;&lt;h2 id=&#34;_remind_mehow_does_this_conference_game_work&#34;&gt;Remind me…how does this conference game work?&lt;/h2&gt;&#xA;&lt;div class=&#34;sectionbody&#34;&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Before we really get into it, let’s recap how this whole game works, because plenty of people are new to conference speaking.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Exploring ksqlDB window start time</title>
      <link>https://rmoff.net/2020/01/09/exploring-ksqldb-window-start-time/</link>
      <pubDate>Thu, 09 Jan 2020 14:25:01 +0000</pubDate>
      <guid>https://rmoff.net/2020/01/09/exploring-ksqldb-window-start-time/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Prompted by &lt;a href=&#34;https://stackoverflow.com/questions/59629748/ksql-aggregating-data-based-on-last-1-year-365-days&#34;&gt;a question on StackOverflow&lt;/a&gt; I had a bit of a dig into how windows behave in ksqlDB, specifically with regards to their start time. This article also shows how to create test data in ksqlDB, including data with a timestamp in the past.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;For a general background to windowing in ksqlDB see &lt;a href=&#34;https://docs.ksqldb.io/en/latest/concepts/time-and-windows-in-ksqldb-queries/&#34;&gt;the excellent docs&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;The nice thing about recent releases of ksqlDB/KSQL is that you can create and populate streams directly with &lt;code&gt;CREATE STREAM&lt;/code&gt; and &lt;code&gt;INSERT INTO&lt;/code&gt; respectively. Much as I love kafkacat, being able to build a whole example within the ksqlDB CLI is very useful.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Streaming messages from RabbitMQ into Kafka with Kafka Connect</title>
      <link>https://rmoff.net/2020/01/08/streaming-messages-from-rabbitmq-into-kafka-with-kafka-connect/</link>
      <pubDate>Wed, 08 Jan 2020 13:06:57 +0000</pubDate>
      <guid>https://rmoff.net/2020/01/08/streaming-messages-from-rabbitmq-into-kafka-with-kafka-connect/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;This was prompted by &lt;a href=&#34;https://stackoverflow.com/questions/59632068/kafka-connect-is-sending-a-malformed-json&#34;&gt;a question&lt;/a&gt; on StackOverflow to which I thought the answer would be straightforward, but turned out not to be so. And then I got a bit carried away and ended up with a nice example of how you can handle schema-less data coming from a system such as RabbitMQ and apply a schema to it.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;This same pattern for ingesting bytes and applying a schema will work with other connectors such as &lt;a href=&#34;https://www.confluent.io/hub/confluentinc/kafka-connect-mqtt&#34;&gt;MQTT&lt;/a&gt;&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Analysing network behaviour with ksqlDB and MongoDB</title>
      <link>https://rmoff.net/2019/12/20/analysing-network-behaviour-with-ksqldb-and-mongodb/</link>
      <pubDate>Fri, 20 Dec 2019 17:23:40 +0000</pubDate>
      <guid>https://rmoff.net/2019/12/20/analysing-network-behaviour-with-ksqldb-and-mongodb/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;In this post I want to build on &lt;a href=&#34;https://rmoff.net/2019/12/18/detecting-and-analysing-ssh-attacks-with-ksqldb/&#34;&gt;my previous one&lt;/a&gt; and show another use of the Syslog data that I’m capturing. Instead of looking for &lt;a href=&#34;https://rmoff.net/2019/12/18/detecting-and-analysing-ssh-attacks-with-ksqldb/&#34;&gt;SSH attacks&lt;/a&gt;, I’m going to analyse the behaviour of my networking components.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;You can find all the code to run this on &lt;a href=&#34;https://github.com/confluentinc/demo-scene/tree/master/syslog&#34;&gt;GitHub&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;sect1&#34;&gt;&#xA;&lt;h2 id=&#34;_getting_syslog_data_into_kafka&#34;&gt;Getting Syslog data into Kafka&lt;/h2&gt;&#xA;&lt;div class=&#34;sectionbody&#34;&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;As before, let’s create ourselves a &lt;a href=&#34;https://www.confluent.io/hub/confluentinc/kafka-connect-syslog&#34;&gt;syslog connector&lt;/a&gt; in ksqlDB:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;listingblock&#34;&gt;&#xA;&lt;div class=&#34;content&#34;&gt;&#xA;&lt;pre class=&#34;rouge highlight&#34; style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;&lt;code data-lang=&#34;sql&#34;&gt;&lt;span style=&#34;color: #66d9ef;font-weight: bold&#34;&gt;CREATE&lt;/span&gt; &lt;span style=&#34;color: #66d9ef;font-weight: bold&#34;&gt;SOURCE&lt;/span&gt; &lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;CONNECTOR&lt;/span&gt; &lt;span style=&#34;color: #f8f8f2;background-color: 
#49483e&#34;&gt;SOURCE_SYSLOG_UDP_01&lt;/span&gt; &lt;span style=&#34;color: #66d9ef;font-weight: bold&#34;&gt;WITH&lt;/span&gt; &lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;(&lt;/span&gt;&#xA;    &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#39;tasks.max&amp;#39;&lt;/span&gt; &lt;span style=&#34;color: #f92672;font-weight: bold&#34;&gt;=&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#39;1&amp;#39;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&#xA;    &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#39;connector.class&amp;#39;&lt;/span&gt; &lt;span style=&#34;color: #f92672;font-weight: bold&#34;&gt;=&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#39;io.confluent.connect.syslog.SyslogSourceConnector&amp;#39;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&#xA;    &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#39;topic&amp;#39;&lt;/span&gt; &lt;span style=&#34;color: #f92672;font-weight: bold&#34;&gt;=&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#39;syslog&amp;#39;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&#xA;    &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#39;syslog.port&amp;#39;&lt;/span&gt; &lt;span style=&#34;color: #f92672;font-weight: bold&#34;&gt;=&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#39;42514&amp;#39;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&#xA;    &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#39;syslog.listener&amp;#39;&lt;/span&gt; &lt;span style=&#34;color: #f92672;font-weight: bold&#34;&gt;=&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#39;UDP&amp;#39;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&#xA;    &lt;span style=&#34;color: 
#e6db74&#34;&gt;&amp;#39;syslog.reverse.dns.remote.ip&amp;#39;&lt;/span&gt; &lt;span style=&#34;color: #f92672;font-weight: bold&#34;&gt;=&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#39;true&amp;#39;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&#xA;    &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#39;confluent.license&amp;#39;&lt;/span&gt; &lt;span style=&#34;color: #f92672;font-weight: bold&#34;&gt;=&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#39;&amp;#39;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&#xA;    &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#39;confluent.topic.bootstrap.servers&amp;#39;&lt;/span&gt; &lt;span style=&#34;color: #f92672;font-weight: bold&#34;&gt;=&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#39;kafka:29092&amp;#39;&lt;/span&gt;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;,&lt;/span&gt;&#xA;    &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#39;confluent.topic.replication.factor&amp;#39;&lt;/span&gt; &lt;span style=&#34;color: #f92672;font-weight: bold&#34;&gt;=&lt;/span&gt; &lt;span style=&#34;color: #e6db74&#34;&gt;&amp;#39;1&amp;#39;&lt;/span&gt;&#xA;&lt;span style=&#34;color: #f8f8f2;background-color: #49483e&#34;&gt;);&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Detecting and Analysing SSH Attacks with ksqlDB</title>
      <link>https://rmoff.net/2019/12/18/detecting-and-analysing-ssh-attacks-with-ksqldb/</link>
      <pubDate>Wed, 18 Dec 2019 17:23:40 +0000</pubDate>
      <guid>https://rmoff.net/2019/12/18/detecting-and-analysing-ssh-attacks-with-ksqldb/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I’ve &lt;a href=&#34;https://www.confluent.io/blog/real-time-syslog-processing-apache-kafka-ksql-part-1-filtering/&#34;&gt;written previously&lt;/a&gt; about ingesting Syslog into Kafka and using KSQL to analyse it. I want to revisit the subject since it’s nearly two years since I wrote about it and some things have changed since then.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;ksqlDB now includes the ability to define connectors from within it, which makes setting things up loads easier.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;You can find the &lt;a href=&#34;https://github.com/confluentinc/demo-scene/tree/master/syslog&#34;&gt;full rig to run this on GitHub&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;sect1&#34;&gt;&#xA;&lt;h2 id=&#34;_create_and_configure_the_syslog_connector&#34;&gt;Create and configure the Syslog connector&lt;/h2&gt;&#xA;&lt;div class=&#34;sectionbody&#34;&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;To start with, create a source connector:&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Copy MongoDB collections from remote to local instance</title>
      <link>https://rmoff.net/2019/12/17/copy-mongodb-collections-from-remote-to-local-instance/</link>
      <pubDate>Tue, 17 Dec 2019 20:23:49 +0000</pubDate>
      <guid>https://rmoff.net/2019/12/17/copy-mongodb-collections-from-remote-to-local-instance/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;This is revisiting &lt;a href=&#34;https://rmoff.net/2018/03/27/cloning-ubiquitis-mongodb-instance-to-a-separate-server/&#34;&gt;the blog I wrote a while back&lt;/a&gt;, which showed using &lt;code&gt;mongodump&lt;/code&gt; and &lt;code&gt;mongorestore&lt;/code&gt; to copy a MongoDB database from one machine (a Unifi CloudKey) to another. This time instead of a manual lift and shift, I wanted a simple way to automate the update of the target with changes made on the source.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;The source is as before, &lt;a href=&#34;https://www.ui.com/unifi/unifi-cloud-key/&#34;&gt;Unifi’s CloudKey&lt;/a&gt;, which runs MongoDB to store its data about the network - devices, access points, events, and so on.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Kafka Connect - Request timed out</title>
      <link>https://rmoff.net/2019/11/29/kafka-connect-request-timed-out/</link>
      <pubDate>Fri, 29 Nov 2019 14:37:24 +0000</pubDate>
      <guid>https://rmoff.net/2019/11/29/kafka-connect-request-timed-out/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;A short &amp;amp; sweet blog post to help people Googling for this error, and me next time I encounter it.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;The scenario: trying to create a connector in Kafka Connect (running in distributed mode, one worker) failed with the &lt;code&gt;curl&lt;/code&gt; response&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-shell&#34; data-lang=&#34;shell&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;HTTP/1.1 &lt;span style=&#34;color:#666&#34;&gt;500&lt;/span&gt; Internal Server Error&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;Date: Fri, &lt;span style=&#34;color:#666&#34;&gt;29&lt;/span&gt; Nov &lt;span style=&#34;color:#666&#34;&gt;2019&lt;/span&gt; 14:33:53 GMT&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;Content-Type: application/json&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;Content-Length: &lt;span style=&#34;color:#666&#34;&gt;48&lt;/span&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;Server: Jetty&lt;span style=&#34;color:#666&#34;&gt;(&lt;/span&gt;9.4.18.v20190429&lt;span style=&#34;color:#666&#34;&gt;)&lt;/span&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#666&#34;&gt;{&lt;/span&gt;&lt;span style=&#34;color:#ba2121&#34;&gt;&amp;#34;error_code&amp;#34;&lt;/span&gt;:500,&lt;span style=&#34;color:#ba2121&#34;&gt;&amp;#34;message&amp;#34;&lt;/span&gt;:&lt;span 
style=&#34;color:#ba2121&#34;&gt;&amp;#34;Request timed out&amp;#34;&lt;/span&gt;&lt;span style=&#34;color:#666&#34;&gt;}&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Using tcpdump With Docker</title>
      <link>https://rmoff.net/2019/11/29/using-tcpdump-with-docker/</link>
      <pubDate>Fri, 29 Nov 2019 11:17:24 +0000</pubDate>
      <guid>https://rmoff.net/2019/11/29/using-tcpdump-with-docker/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I was doing some troubleshooting between two services recently and wanting to poke around to see what was happening in the REST calls between them. Normally I’d reach for &lt;code&gt;tcpdump&lt;/code&gt; to do this but imagine my horror when I saw:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-shell&#34; data-lang=&#34;shell&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;root@ksqldb-server:/# tcpdump&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;bash: tcpdump: &lt;span style=&#34;color:#008000&#34;&gt;command&lt;/span&gt; not found&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Common mistakes made when configuring multiple Kafka Connect workers</title>
      <link>https://rmoff.net/2019/11/22/common-mistakes-made-when-configuring-multiple-kafka-connect-workers/</link>
      <pubDate>Fri, 22 Nov 2019 11:33:48 +0000</pubDate>
      <guid>https://rmoff.net/2019/11/22/common-mistakes-made-when-configuring-multiple-kafka-connect-workers/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Kafka Connect can be deployed in two modes: &lt;strong&gt;Standalone&lt;/strong&gt; or &lt;strong&gt;Distributed&lt;/strong&gt;. You can learn more about them in my &lt;a href=&#34;http://rmoff.dev/ksldn19-kafka-connect&#34;&gt;Kafka Summit London 2019 talk&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I usually recommend &lt;strong&gt;Distributed&lt;/strong&gt; for several reasons:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;ulist&#34;&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;It can scale&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;It is fault-tolerant&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;It can be run on a single node sandbox or a multi-node production environment&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;It is the same configuration method however you run it&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I usually find that &lt;strong&gt;Standalone&lt;/strong&gt; is appropriate when:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;ulist&#34;&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;You need to guarantee locality of task execution, such as picking up a log file from a folder on a specific machine&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;You don’t care about scale or fault-tolerance ;-)&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;You like re-learning how to configure something when you realise that you &lt;em&gt;do&lt;/em&gt; care about scale or fault-tolerance X-D&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Streaming data from SQL Server to Kafka to Snowflake ❄️ with Kafka Connect</title>
      <link>https://rmoff.net/2019/11/20/streaming-data-from-sql-server-to-kafka-to-snowflake-with-kafka-connect/</link>
      <pubDate>Wed, 20 Nov 2019 17:59:50 +0000</pubDate>
      <guid>https://rmoff.net/2019/11/20/streaming-data-from-sql-server-to-kafka-to-snowflake-with-kafka-connect/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://www.snowflake.com/&#34;&gt;Snowflake&lt;/a&gt; is &lt;em&gt;the data warehouse built for the cloud&lt;/em&gt;, so let’s get all ☁️ cloudy and stream some data from Kafka running in &lt;a href=&#34;https://confluent.cloud&#34;&gt;Confluent Cloud&lt;/a&gt; to Snowflake!&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;What I’m showing also works just as well for an on-premises Kafka cluster. I’m using SQL Server as an example data source, with Debezium to capture and stream changes from it into Kafka.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;imageblock&#34;&gt;&#xA;&lt;div class=&#34;content&#34;&gt;&#xA;&lt;img src=&#34;https://rmoff.net/images/2019/11/sf01.png&#34; alt=&#34;sf01&#34;/&gt;&#xA;&lt;/div&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I’m assuming that you’ve signed up for &lt;a href=&#34;https://confluent.cloud/&#34;&gt;Confluent Cloud&lt;/a&gt; and &lt;a href=&#34;https://www.snowflake.com/try-the-data-warehouse-built-for-the-cloud/&#34;&gt;Snowflake&lt;/a&gt; and are the proud owner of credentials for both. I’m going to use a demo rig based on Docker to provision SQL Server and a Kafka Connect worker, but you can use your own setup if you want.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Running Dockerised Kafka Connect worker on GCP</title>
      <link>https://rmoff.net/2019/11/12/running-dockerised-kafka-connect-worker-on-gcp/</link>
      <pubDate>Tue, 12 Nov 2019 14:45:43 +0000</pubDate>
      <guid>https://rmoff.net/2019/11/12/running-dockerised-kafka-connect-worker-on-gcp/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I &lt;a href=&#34;http://talks.rmoff.net/&#34;&gt;talk and write about Kafka and Confluent Platform&lt;/a&gt; a lot, and more and more of the demos that I’m building are around &lt;a href=&#34;https://confluent.cloud&#34;&gt;Confluent Cloud&lt;/a&gt;. This means that I don’t have to run or manage my own Kafka brokers, Zookeeper, Schema Registry, KSQL servers, etc which makes things a ton easier. Whilst there are managed connectors on Confluent Cloud (S3 etc), I need to run my own Kafka Connect worker for those connectors not yet provided. An example is the MQTT source connector that I use in &lt;a href=&#34;https://rmoff.dev/kssf19-ksql-video&#34;&gt;this demo&lt;/a&gt;. Up until now I’d either run this worker locally, or manually build a cloud VM. Locally is fine, as it’s all Docker, easily spun up in a single &lt;code&gt;docker-compose up -d&lt;/code&gt; command. I wanted something that would keep running whilst my laptop was off, but that was as close to my local build as possible—enter GCP and its functionality to run a container on a VM automagically.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;strong&gt;You can see &lt;a href=&#34;https://github.com/confluentinc/demo-scene/blob/master/mqtt-tracker/launch-worker-container_gcloud.sh&#34;&gt;the full script here&lt;/a&gt;&lt;/strong&gt;. The rest of this article just walks through the how and why.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Debezium &amp; MySQL v8 : Public Key Retrieval Is Not Allowed</title>
      <link>https://rmoff.net/2019/10/23/debezium-mysql-v8-public-key-retrieval-is-not-allowed/</link>
      <pubDate>Wed, 23 Oct 2019 11:54:51 -0400</pubDate>
      <guid>https://rmoff.net/2019/10/23/debezium-mysql-v8-public-key-retrieval-is-not-allowed/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I started hitting problems when trying Debezium against MySQL v8. When creating the connector:&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Using Kafka Connect and Debezium with Confluent Cloud</title>
      <link>https://rmoff.net/2019/10/16/using-kafka-connect-and-debezium-with-confluent-cloud/</link>
      <pubDate>Wed, 16 Oct 2019 16:29:34 +0100</pubDate>
      <guid>https://rmoff.net/2019/10/16/using-kafka-connect-and-debezium-with-confluent-cloud/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;This is based on using &lt;a href=&#34;https://confluent.cloud&#34;&gt;Confluent Cloud&lt;/a&gt; to provide your managed Kafka and Schema Registry. All that you run yourself is the Kafka Connect worker.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Optionally, you can use this &lt;a href=&#34;https://github.com/rmoff/debezium-ccloud/blob/master/docker-compose.yml&#34;&gt;Docker Compose&lt;/a&gt; to run the worker and a sample MySQL database.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Skipping bad records with the Kafka Connect JDBC sink connector</title>
      <link>https://rmoff.net/2019/10/15/skipping-bad-records-with-the-kafka-connect-jdbc-sink-connector/</link>
      <pubDate>Tue, 15 Oct 2019 09:58:38 +0100</pubDate>
      <guid>https://rmoff.net/2019/10/15/skipping-bad-records-with-the-kafka-connect-jdbc-sink-connector/</guid>
      <description>&lt;div id=&#34;preamble&#34;&gt;&#xA;&lt;div class=&#34;sectionbody&#34;&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;The Kafka Connect framework provides generic &lt;a href=&#34;https://www.confluent.io/blog/kafka-connect-deep-dive-error-handling-dead-letter-queues&#34;&gt;error handling and dead-letter queue capabilities&lt;/a&gt; which are available for problems with [de]serialisation and Single Message Transforms. When it comes to errors that a connector may encounter doing the actual &lt;code&gt;pull&lt;/code&gt; or &lt;code&gt;put&lt;/code&gt; of data from the source/target system, it’s down to the connector itself to implement logic around that. For example, the Elasticsearch sink connector provides configuration (&lt;code&gt;behavior.on.malformed.documents&lt;/code&gt;) that can be set so that a single bad record won’t halt the pipeline. Others, such as the JDBC Sink connector, don’t provide this &lt;a href=&#34;https://github.com/confluentinc/kafka-connect-jdbc/issues/721&#34;&gt;yet&lt;/a&gt;. That means that if you hit this problem, you need to manually unblock it yourself. One way is to manually move the offset of the consumer on past the bad message.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;strong&gt;TL;DR&lt;/strong&gt; : You can use &lt;code&gt;kafka-consumer-groups --reset-offsets --to-offset &amp;lt;x&amp;gt;&lt;/code&gt; to manually move the connector past a bad message&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Kafka Connect and Elasticsearch</title>
      <link>https://rmoff.net/2019/10/07/kafka-connect-and-elasticsearch/</link>
      <pubDate>Mon, 07 Oct 2019 15:44:59 +0100</pubDate>
      <guid>https://rmoff.net/2019/10/07/kafka-connect-and-elasticsearch/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I use the Elastic stack for a lot of my &lt;a href=&#34;https://talks.rmoff.net/&#34;&gt;talks&lt;/a&gt; and &lt;a href=&#34;https://github.com/confluentinc/demo-scene/&#34;&gt;demos&lt;/a&gt; because it complements Kafka brilliantly. A few things have changed in recent releases and this blog is a quick note on some of the errors that you might hit and how to resolve them. It was inspired by a lot of the comments and discussion &lt;a href=&#34;https://github.com/confluentinc/kafka-connect-elasticsearch/issues/314&#34;&gt;here&lt;/a&gt; and &lt;a href=&#34;https://github.com/confluentinc/kafka-connect-elasticsearch/issues/342&#34;&gt;here&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Copying data between Kafka clusters with Kafkacat</title>
      <link>https://rmoff.net/2019/09/29/copying-data-between-kafka-clusters-with-kafkacat/</link>
      <pubDate>Sun, 29 Sep 2019 10:43:45 +0200</pubDate>
      <guid>https://rmoff.net/2019/09/29/copying-data-between-kafka-clusters-with-kafkacat/</guid>
      <description>&lt;div class=&#34;sect1&#34;&gt;&#xA;&lt;h2 id=&#34;_kafkacat_gives_you_kafka_super_powers&#34;&gt;kafkacat gives you Kafka super powers 😎&lt;/h2&gt;&#xA;&lt;div class=&#34;sectionbody&#34;&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I’ve &lt;a href=&#34;https://rmoff.net/categories/kcat-kafkacat/&#34;&gt;written before&lt;/a&gt; about &lt;a href=&#34;https://github.com/edenhill/kafkacat&#34;&gt;kafkacat&lt;/a&gt; and what a great tool it is for doing lots of useful things as a developer with Kafka. I used it too in &lt;a href=&#34;https://talks.rmoff.net/8Oruwt/on-track-with-apache-kafka-building-a-streaming-etl-solution-with-rail-data#s9tMEWG&#34;&gt;a recent demo&lt;/a&gt; that I built in which data needed manipulating in a way that I couldn’t easily do elsewhere. Today I want to share a very simple but powerful use for kafkacat as both a consumer and producer: copying data from one Kafka cluster to another. In this instance it’s getting data from &lt;a href=&#34;https://confluent.cloud/&#34;&gt;Confluent Cloud&lt;/a&gt; down to a local cluster.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Kafka Summit GoldenGate bridge run/walk</title>
      <link>https://rmoff.net/2019/09/23/kafka-summit-goldengate-bridge-run/walk/</link>
      <pubDate>Mon, 23 Sep 2019 10:55:23 +0100</pubDate>
      <guid>https://rmoff.net/2019/09/23/kafka-summit-goldengate-bridge-run/walk/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Coming to Kafka Summit in San Francisco next week? Inspired by &lt;a href=&#34;https://www.facebook.com/oraclesqldev/photos/gm.1401265536847886/1228813493825348/?type=3&amp;amp;theater&#34;&gt;similar events&lt;/a&gt; at Oracle OpenWorld in past years, I’m proposing an unofficial run (or walk) across the GoldenGate bridge on the morning of Tuesday 1st October. We should be up and out and back in plenty of time to still attend the morning keynotes. Some people will run, some may prefer to walk, it’s open to everyone :)&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Staying sane on the road as a Developer Advocate</title>
      <link>https://rmoff.net/2019/09/19/staying-sane-on-the-road-as-a-developer-advocate/</link>
      <pubDate>Thu, 19 Sep 2019 23:38:42 +0100</pubDate>
      <guid>https://rmoff.net/2019/09/19/staying-sane-on-the-road-as-a-developer-advocate/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I’ve been a full-time Developer Advocate for nearly 1.5 years now, and have learnt lots along the way. The stuff I’ve learnt about being an advocate I’ve written about elsewhere (&lt;a href=&#34;https://rmoff.net/2018/12/19/quick-thoughts-on-not-writing-a-crap-abstract/&#34;&gt;here&lt;/a&gt;/&lt;a href=&#34;https://rmoff.net/2019/03/19/quick-thoughts-on-not-making-a-crap-slide-deck/&#34;&gt;here&lt;/a&gt;/&lt;a href=&#34;https://rmoff.net/2019/03/01/preparing-a-new-talk/&#34;&gt;here&lt;/a&gt;); today I want to write about something that’s just as important: staying sane and looking after yourself whilst on the road. This is also tangentially related to another of my favourite posts that I’ve written: &lt;a href=&#34;https://rmoff.net/2019/02/09/travelling-for-work-with-kids-at-home/&#34;&gt;Travelling for Work, with Kids at Home&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Where I&#39;ll be on the road for the remainder of 2019</title>
      <link>https://rmoff.net/2019/09/02/where-ill-be-on-the-road-for-the-remainder-of-2019/</link>
      <pubDate>Mon, 02 Sep 2019 17:36:01 +0100</pubDate>
      <guid>https://rmoff.net/2019/09/02/where-ill-be-on-the-road-for-the-remainder-of-2019/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I’ve had a relaxing couple of weeks off work over the summer, and came back today to realise that I’ve got a fair bit of conference and meetup travel to wrap my head around for the next few months :)&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;If you’re interested in where I’ll be and want to come and say hi, hear about Kafka—or just grab a coffee or beer, herewith my itinerary as it currently stands.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Reset Kafka Connect Source Connector Offsets</title>
      <link>https://rmoff.net/2019/08/15/reset-kafka-connect-source-connector-offsets/</link>
      <pubDate>Thu, 15 Aug 2019 10:42:34 +0100</pubDate>
      <guid>https://rmoff.net/2019/08/15/reset-kafka-connect-source-connector-offsets/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Kafka Connect in distributed mode uses Kafka itself to persist the offsets of any source connectors. This is a great way to do things as it means that you can easily add more workers, rebuild existing ones, etc without having to worry about where the state is persisted. I personally always recommend using distributed mode, even if just for a single worker instance - it just makes things easier, and more standard. Watch my &lt;a href=&#34;https://www.confluent.io/online-talks/from-zero-to-hero-with-kafka-connect&#34;&gt;talk online here&lt;/a&gt; to understand more about this. If you want to &lt;em&gt;reset&lt;/em&gt; the offset of a source connector then you can do so by &lt;em&gt;very carefully&lt;/em&gt; modifying the data in the Kafka topic itself.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Starting a Kafka Connect sink connector at the end of a topic</title>
      <link>https://rmoff.net/2019/08/09/starting-a-kafka-connect-sink-connector-at-the-end-of-a-topic/</link>
      <pubDate>Fri, 09 Aug 2019 17:11:06 +0200</pubDate>
      <guid>https://rmoff.net/2019/08/09/starting-a-kafka-connect-sink-connector-at-the-end-of-a-topic/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;When you create a sink connector in Kafka Connect, by default it will start reading from the beginning of the topic and stream all of the existing—and new—data to the target. The setting that controls this behaviour is &lt;code&gt;auto.offset.reset&lt;/code&gt;, and you can see its value in the worker log when the connector runs:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-shell&#34; data-lang=&#34;shell&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#666&#34;&gt;[&lt;/span&gt;2019-08-05 23:31:35,405&lt;span style=&#34;color:#666&#34;&gt;]&lt;/span&gt; INFO ConsumerConfig values:&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;        allow.auto.create.topics &lt;span style=&#34;color:#666&#34;&gt;=&lt;/span&gt; &lt;span style=&#34;color:#008000&#34;&gt;true&lt;/span&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;        auto.commit.interval.ms &lt;span style=&#34;color:#666&#34;&gt;=&lt;/span&gt; &lt;span style=&#34;color:#666&#34;&gt;5000&lt;/span&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;        auto.offset.reset &lt;span style=&#34;color:#666&#34;&gt;=&lt;/span&gt; earliest&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;…&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Resetting a Consumer Group in Kafka</title>
      <link>https://rmoff.net/2019/08/09/resetting-a-consumer-group-in-kafka/</link>
      <pubDate>Fri, 09 Aug 2019 16:32:46 +0200</pubDate>
      <guid>https://rmoff.net/2019/08/09/resetting-a-consumer-group-in-kafka/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I’ve been using &lt;a href=&#34;https://docs.confluent.io/current/connect/kafka-connect-replicator/index.html&#34;&gt;Replicator&lt;/a&gt; as a powerful way to copy data from my Kafka rig at home onto my laptop’s Kafka environment. It means that when I’m on the road I can continue to work with the same set of data and develop pipelines etc. With a VPN back home I can even keep them in sync directly if I want to.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I hit a problem the other day where Replicator was running, but I had no data in my target topics on my laptop. After a bit of head-scratching I realised that my local Kafka environment had been rebuilt (I use Docker Compose so complete rebuilds to start from scratch are easy), hence no data in the topic. But, even after restarting the Replicator Kafka Connect worker, I still had no data loaded into the empty topics. What was going on? Well, Replicator acts as a consumer from the source Kafka cluster (on my home server), and so far as that Kafka cluster was concerned, Replicator had already read the messages. It thought this because, even though I’d rebuilt everything on my laptop, Replicator was using the same connector name as before, and the connector name is used as the consumer group name - which is how the &lt;em&gt;source&lt;/em&gt; Kafka cluster keeps track of the offsets. So my &amp;#34;new&amp;#34; Kafka environment was going back to the source, which viewed it as the existing &amp;#34;old&amp;#34; one, which had already received the messages.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Migrating Alfred Clipboard to New Laptop</title>
      <link>https://rmoff.net/2019/08/07/migrating-alfred-clipboard-to-new-laptop/</link>
      <pubDate>Wed, 07 Aug 2019 14:23:33 -0700</pubDate>
      <guid>https://rmoff.net/2019/08/07/migrating-alfred-clipboard-to-new-laptop/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Alfred is one of my favourite productivity tools. One of its best features is the clipboard history; when I moved laptops and it didn’t transfer, I realised quite &lt;em&gt;how&lt;/em&gt; much I rely on this functionality in my day-to-day work.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;imageblock&#34;&gt;&#xA;&lt;div class=&#34;content&#34;&gt;&#xA;&lt;img src=&#34;https://rmoff.net/images/2019/08/alfred_clipboard.gif&#34; alt=&#34;alfred clipboard&#34;/&gt;&#xA;&lt;/div&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Whilst Alfred has the option to synchronise its preferences across machines, it seems that it doesn’t synchronise the clipboard database. To get it to work I did the following:&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>So how DO you make those cool diagrams? July 2019 update</title>
      <link>https://rmoff.net/2019/07/11/so-how-do-you-make-those-cool-diagrams-july-2019-update/</link>
      <pubDate>Thu, 11 Jul 2019 11:12:26 +0200</pubDate>
      <guid>https://rmoff.net/2019/07/11/so-how-do-you-make-those-cool-diagrams-july-2019-update/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I &lt;a href=&#34;https://www.confluent.io/blog/author/robin/&#34;&gt;write&lt;/a&gt; and &lt;a href=&#34;http://talks.rmoff.net/&#34;&gt;speak&lt;/a&gt; lots about Kafka, and get a fair few questions from this. The most common question is actually nothing to do with Kafka, but instead:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;quoteblock&#34;&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;How do you make those cool diagrams?&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I wrote about this originally &lt;a href=&#34;https://rmoff.net/2018/12/10/so-how-do-you-make-those-cool-diagrams/&#34;&gt;last year&lt;/a&gt; but since then have evolved my approach. I’ve now pretty much ditched &lt;a href=&#34;https://paper.bywetransfer.com/&#34;&gt;Paper&lt;/a&gt;, in favour of &lt;a href=&#34;https://concepts.app/en/&#34;&gt;Concepts&lt;/a&gt;. It was recommended to me after I published the previous post.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;blockquote class=&#34;twitter-tweet&#34; data-lang=&#34;en&#34;&gt;&lt;p lang=&#34;en&#34; dir=&#34;ltr&#34;&gt;Robin, check Concepts app. IMHO the best solution on iPad + Pencil setup. Some drawing skills are still needed, but this app is f… amazing :)&lt;/p&gt;— Mariusz Gil (@mariuszgil) &lt;a href=&#34;https://twitter.com/mariuszgil/status/1072520461689503744?ref_src=twsrc%5Etfw&#34;&gt;December 11, 2018&lt;/a&gt;&lt;/blockquote&gt;&#xA;&lt;script async=&#34;&#34; src=&#34;https://platform.twitter.com/widgets.js&#34; charset=&#34;utf-8&#34;&gt;&lt;/script&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I still use the Apple Pencil + iPad, but Concepts is vector-based, supports layers, grids, guidelines…is generally more powerful. Concepts is definitely worth the cost (and learning curve). 
It doesn’t have &lt;em&gt;quite&lt;/em&gt; the same &amp;#39;hand drawn&amp;#39; feel to it, but it’s close enough for what I want whilst not spending my life re-drawing bitmaps for every permutation of a diagram’s lifecycle :)&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Taking the Vienna-Munich sleeper train</title>
      <link>https://rmoff.net/2019/07/03/taking-the-vienna-munich-sleeper-train/</link>
      <pubDate>Wed, 03 Jul 2019 07:17:12 +0200</pubDate>
      <guid>https://rmoff.net/2019/07/03/taking-the-vienna-munich-sleeper-train/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;This week I was scheduled in to a couple of meetups, in Vienna and Munich. Flying is an inevitable part of travel since I also happen to &lt;a href=&#34;https://rmoff.net/2019/02/09/travelling-for-work-with-kids-at-home/&#34;&gt;like being home seeing my family&lt;/a&gt; and airplanes are usually the quickest way to make this happen. I don’t particularly enjoy flying, and there’s the environmental impact of it too—so when I realised that Vienna and Munich are &lt;em&gt;relatively&lt;/em&gt; close to each other I looked at getting the train. In the UK trains are generally a few hours, and certainly not overnight (bar a couple of exceptions), so the novelty of getting a sleeper train appealed.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Manually delete a connector from Kafka Connect</title>
      <link>https://rmoff.net/2019/06/23/manually-delete-a-connector-from-kafka-connect/</link>
      <pubDate>Sun, 23 Jun 2019 11:39:46 +0200</pubDate>
      <guid>https://rmoff.net/2019/06/23/manually-delete-a-connector-from-kafka-connect/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Kafka Connect has a &lt;a href=&#34;https://docs.confluent.io/current/connect/references/restapi.html&#34;&gt;REST API&lt;/a&gt; through which all config should be done, including removing connectors that have been created. Sometimes though, you might have reason to want to manually do this—and since Kafka Connect running in distributed mode uses Kafka as its persistent data store, you can achieve this by manually writing to the topic yourself.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Automatically restarting failed Kafka Connect tasks</title>
      <link>https://rmoff.net/2019/06/06/automatically-restarting-failed-kafka-connect-tasks/</link>
      <pubDate>Thu, 06 Jun 2019 17:51:44 +0100</pubDate>
      <guid>https://rmoff.net/2019/06/06/automatically-restarting-failed-kafka-connect-tasks/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Here’s a hacky way to automatically restart Kafka Connect connectors if they fail. Restarting automatically only makes sense if it’s a transient failure; if there’s a problem with your pipeline (e.g. bad records or a mis-configured server) then you don’t gain anything from this. You might want to check out &lt;a href=&#34;https://www.confluent.io/blog/kafka-connect-deep-dive-error-handling-dead-letter-queues&#34;&gt;Kafka Connect’s error handling and dead letter queues&lt;/a&gt; too.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Putting Kafka Connect passwords in a separate file / externalising secrets</title>
      <link>https://rmoff.net/2019/05/24/putting-kafka-connect-passwords-in-a-separate-file-/-externalising-secrets/</link>
      <pubDate>Fri, 24 May 2019 17:30:57 +0100</pubDate>
      <guid>https://rmoff.net/2019/05/24/putting-kafka-connect-passwords-in-a-separate-file-/-externalising-secrets/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Kafka Connect configuration is easy - you just write some JSON! But what if you’ve got credentials that you need to pass? Embedding those in a config file is not always such a smart idea. Fortunately, with &lt;a href=&#34;https://cwiki.apache.org/confluence/display/KAFKA/KIP-297%3A+Externalizing+Secrets+for+Connect+Configurations&#34;&gt;KIP-297&lt;/a&gt;, which was released in Apache Kafka 2.0, there is support for external secrets. It’s extensible with your own &lt;code&gt;ConfigProvider&lt;/code&gt;, and ships with one for simply putting credentials in a file - which I’ll show here. You can &lt;a href=&#34;https://docs.confluent.io/current/connect/security.html#externalizing-secrets&#34;&gt;read more here&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Deleting a Connector in Kafka Connect without the REST API</title>
      <link>https://rmoff.net/2019/05/22/deleting-a-connector-in-kafka-connect-without-the-rest-api/</link>
      <pubDate>Wed, 22 May 2019 10:32:10 +0100</pubDate>
      <guid>https://rmoff.net/2019/05/22/deleting-a-connector-in-kafka-connect-without-the-rest-api/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Kafka Connect exposes a &lt;a href=&#34;https://docs.confluent.io/current/connect/references/restapi.html&#34;&gt;REST interface&lt;/a&gt; through which all config and monitoring operations can be done. You can create connectors, delete them, restart them, check their status, and so on. But, I found a situation recently in which I needed to delete a connector and couldn’t do so with the REST API. Here’s another way to do it, by amending the configuration Kafka topic that Kafka Connect in distributed mode uses to persist configuration information for connectors. Note that this is not a recommended way of working with Kafka Connect—the REST API is there for a good reason :)&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>A poor man&#39;s KSQL EXPLODE/UNNEST technique</title>
      <link>https://rmoff.net/2019/05/09/a-poor-mans-ksql-explode/unnest-technique/</link>
      <pubDate>Thu, 09 May 2019 10:01:50 +0100</pubDate>
      <guid>https://rmoff.net/2019/05/09/a-poor-mans-ksql-explode/unnest-technique/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;There is an &lt;a href=&#34;https://github.com/confluentinc/ksql/issues/527&#34;&gt;open issue for support of &lt;code&gt;EXPLODE&lt;/code&gt;/&lt;code&gt;UNNEST&lt;/code&gt; functionality in KSQL&lt;/a&gt;, and if you need it then do up-vote the issue. Here I detail a hacky, but effective, workaround for exploding arrays into multiple messages—so long as you know the upper-bound on your array.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>When a Kafka Connect converter is not a _converter_</title>
      <link>https://rmoff.net/2019/05/08/when-a-kafka-connect-converter-is-not-a-_converter_/</link>
      <pubDate>Wed, 08 May 2019 10:06:50 +0100</pubDate>
      <guid>https://rmoff.net/2019/05/08/when-a-kafka-connect-converter-is-not-a-_converter_/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Kafka Connect is an API within Apache Kafka and its modular nature makes it powerful and flexible. Converters are part of the API but not always fully understood. I’ve written previously about &lt;a href=&#34;https://www.confluent.io/blog/kafka-connect-deep-dive-converters-serialization-explained&#34;&gt;Kafka Connect converters&lt;/a&gt;, and this post is just a hands-on example to show even further what they are—and are not—about.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;admonitionblock note&#34;&gt;&#xA;&lt;table&gt;&#xA;&lt;tbody&gt;&lt;tr&gt;&#xA;&lt;td class=&#34;icon&#34;&gt;&#xA;&lt;i class=&#34;fa icon-note&#34; title=&#34;Note&#34;&gt;&lt;/i&gt;&#xA;&lt;/td&gt;&#xA;&lt;td class=&#34;content&#34;&gt;&#xA;To understand more about Kafka Connect in general, check out my talk from Kafka Summit London &lt;a href=&#34;https://talks.rmoff.net/QZ5nsS/from-zero-to-hero-with-kafka-connect&#34;&gt;&lt;em&gt;From Zero to Hero with Kafka Connect&lt;/em&gt;&lt;/a&gt;.&#xA;&lt;/td&gt;&#xA;&lt;/tr&gt;&#xA;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Reading Kafka Connect Offsets via the REST Proxy</title>
      <link>https://rmoff.net/2019/05/02/reading-kafka-connect-offsets-via-the-rest-proxy/</link>
      <pubDate>Thu, 02 May 2019 10:58:27 +0100</pubDate>
      <guid>https://rmoff.net/2019/05/02/reading-kafka-connect-offsets-via-the-rest-proxy/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;When you run Kafka Connect in distributed mode it uses a Kafka topic to store the offset information for each connector. Because it’s just a Kafka topic, you can read that information using any consumer.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Pivoting Aggregates in Ksql</title>
      <link>https://rmoff.net/2019/04/17/pivoting-aggregates-in-ksql/</link>
      <pubDate>Wed, 17 Apr 2019 15:42:56 +0100</pubDate>
      <guid>https://rmoff.net/2019/04/17/pivoting-aggregates-in-ksql/</guid>
      <description>&lt;p&gt;Prompted by &lt;a href=&#34;https://stackoverflow.com/questions/55680719/aggregating-by-multiple-fields-and-map-to-one-result&#34;&gt;a question on StackOverflow&lt;/a&gt;, the requirement is to take a series of events related to a common key and for each key output a series of aggregates derived from a changing value in the events. I&amp;rsquo;ll use the data from the question, based on ticket statuses. Each ticket can go through various stages, and the requirement was to show, per customer, how many tickets are currently at each stage.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Connecting KSQL to a Secured Schema Registry</title>
      <link>https://rmoff.net/2019/04/12/connecting-ksql-to-a-secured-schema-registry/</link>
      <pubDate>Fri, 12 Apr 2019 12:59:33 +0100</pubDate>
      <guid>https://rmoff.net/2019/04/12/connecting-ksql-to-a-secured-schema-registry/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;See also : &lt;a href=&#34;https://docs.confluent.io/current/ksql/docs/installation/server-config/security.html#configuring-ksql-for-secured-sr-long&#34; class=&#34;bare&#34;&gt;https://docs.confluent.io/current/ksql/docs/installation/server-config/security.html#configuring-ksql-for-secured-sr-long&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Confluent Cloud now includes a secured Schema Registry, which you can use from external applications, including KSQL.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;To configure KSQL for it you need to set:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-shell&#34; data-lang=&#34;shell&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;ksql.schema.registry.url&lt;span style=&#34;color:#666&#34;&gt;=&lt;/span&gt;https://&amp;lt;Schema Registry endpoint&amp;gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;ksql.schema.registry.basic.auth.credentials.source&lt;span style=&#34;color:#666&#34;&gt;=&lt;/span&gt;USER_INFO&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;ksql.schema.registry.basic.auth.user.info&lt;span style=&#34;color:#666&#34;&gt;=&lt;/span&gt;&amp;lt;Schema Registry API Key&amp;gt;:&amp;lt;Schema Registry API Secret&amp;gt;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Exploring KSQL Stream-Stream Joins</title>
      <link>https://rmoff.net/2019/03/28/exploring-ksql-stream-stream-joins/</link>
      <pubDate>Thu, 28 Mar 2019 14:46:24 +0000</pubDate>
      <guid>https://rmoff.net/2019/03/28/exploring-ksql-stream-stream-joins/</guid>
      <description>&lt;div class=&#34;sect1&#34;&gt;&#xA;&lt;h2 id=&#34;_introduction&#34;&gt;Introduction&lt;/h2&gt;&#xA;&lt;div class=&#34;sectionbody&#34;&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;What can you use stream-stream joins for? Can you use them to join between a stream of orders and stream of related shipments to do useful things? What’s not supported in KSQL, where are the cracks?&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Terminate All KSQL Queries</title>
      <link>https://rmoff.net/2019/03/25/terminate-all-ksql-queries/</link>
      <pubDate>Mon, 25 Mar 2019 16:45:40 +0000</pubDate>
      <guid>https://rmoff.net/2019/03/25/terminate-all-ksql-queries/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Before you can drop a stream or table that’s populated by a query in KSQL, you have to terminate any queries upon which the object is dependent. Here’s a bit of &lt;code&gt;jq&lt;/code&gt; &amp;amp; &lt;code&gt;xargs&lt;/code&gt; magic to terminate &lt;strong&gt;all&lt;/strong&gt; queries that are currently running.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Quick Thoughts on Not Making a Crap Slide Deck</title>
      <link>https://rmoff.net/2019/03/19/quick-thoughts-on-not-making-a-crap-slide-deck/</link>
      <pubDate>Tue, 19 Mar 2019 10:10:34 +0000</pubDate>
      <guid>https://rmoff.net/2019/03/19/quick-thoughts-on-not-making-a-crap-slide-deck/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;This post is the companion to an earlier one that I wrote about &lt;a href=&#34;https://rmoff.net/2018/12/19/quick-thoughts-on-not-writing-a-crap-abstract/&#34;&gt;conference abstracts&lt;/a&gt;. In the same way that the last one was inspired by reviewing a ton of abstracts and noticing a recurring pattern in my suggestions, so this one comes from reviewing a bunch of slide decks for a forthcoming conference. They all look like good talks, but in several cases &lt;em&gt;these great talks are fighting to get out from underneath the deadening weight of slides&lt;/em&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Herewith follows my highly-opinionated, fairly-subjective, and extremely-terse advice and general suggestions for slide decks. You can also find related ramblings in &lt;a href=&#34;https://rmoff.net/2019/03/01/preparing-a-new-talk/&#34;&gt;this recent post&lt;/a&gt; too. My friend and colleague Vik Gamov also wrote &lt;a href=&#34;https://gamov.io/posts/2019/03/15/quick-tips-on-designing-your-next-presentation.html&#34;&gt;a good post&lt;/a&gt; on this same topic, and linked to &lt;a href=&#34;https://player.oreilly.com/videos/9781491954980&#34;&gt;a good video&lt;/a&gt; that I’d recommend you watch.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Using httpie with the Kafka REST Proxy</title>
      <link>https://rmoff.net/2019/03/08/using-httpie-with-the-kafka-rest-proxy/</link>
      <pubDate>Fri, 08 Mar 2019 15:37:42 +0000</pubDate>
      <guid>https://rmoff.net/2019/03/08/using-httpie-with-the-kafka-rest-proxy/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;This shows how to use &lt;a href=&#34;https://httpie.org/&#34;&gt;httpie&lt;/a&gt; with the &lt;a href=&#34;https://docs.confluent.io/current/kafka-rest/docs/index.html&#34;&gt;Confluent REST Proxy&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;sect1&#34;&gt;&#xA;&lt;h2 id=&#34;_send_data&#34;&gt;Send data&lt;/h2&gt;&#xA;&lt;div class=&#34;sectionbody&#34;&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-shell&#34; data-lang=&#34;shell&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#008000&#34;&gt;echo&lt;/span&gt; &lt;span style=&#34;color:#ba2121&#34;&gt;&amp;#39;{&amp;#34;records&amp;#34;:[{&amp;#34;value&amp;#34;:{&amp;#34;foo&amp;#34;:&amp;#34;bar&amp;#34;}}]}&amp;#39;&lt;/span&gt; | &lt;span style=&#34;color:#b62;font-weight:bold&#34;&gt;\&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;  http POST http://localhost:8082/topics/jsontest &lt;span style=&#34;color:#b62;font-weight:bold&#34;&gt;\&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;  Content-Type:application/vnd.kafka.json.v2+json Accept:application/vnd.kafka.v2+json&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Preparing a New Talk</title>
      <link>https://rmoff.net/2019/03/01/preparing-a-new-talk/</link>
      <pubDate>Fri, 01 Mar 2019 11:00:26 +0100</pubDate>
      <guid>https://rmoff.net/2019/03/01/preparing-a-new-talk/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I’ve written quite a few talks over the years, but usually as a side-line to my day job. In my role as a Developer Advocate, talks are part of &lt;em&gt;What I Do&lt;/em&gt;, and so I can dedicate more time to it. A lot of the talks I’ve done previously have evolved through numerous iterations, and with &lt;a href=&#34;https://talks.rmoff.net/the-changing-face-of-etl-event-driven-architectures-for-data-engineers/&#34;&gt;a new talk to deliver&lt;/a&gt; for the &amp;#34;Spring Season&amp;#34; of conferences, I thought it would be interesting to track what it took from concept to actual delivery.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Travelling for Work, with Kids at Home</title>
      <link>https://rmoff.net/2019/02/09/travelling-for-work-with-kids-at-home/</link>
      <pubDate>Sat, 09 Feb 2019 14:13:21 +0000</pubDate>
      <guid>https://rmoff.net/2019/02/09/travelling-for-work-with-kids-at-home/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I began travelling for my job when my first child was three months old. But don’t mistake correlation for causation…it wasn’t the broken nights&amp;#39; sleep that forced me onto the road, but an excellent job opportunity that seemed worth the risk. Nearly eight years later and I’m in a different job but still with a bunch of travel involved. How much I travel has varied. It’s tended to average around 30%, but has peaked at way more than that. I’ve worked in consultancy, business development, and as a developer advocate in that time.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Kafka Connect Change Log Level and Write Log to File</title>
      <link>https://rmoff.net/2019/01/29/kafka-connect-change-log-level-and-write-log-to-file/</link>
      <pubDate>Tue, 29 Jan 2019 11:15:01 -0800</pubDate>
      <guid>https://rmoff.net/2019/01/29/kafka-connect-change-log-level-and-write-log-to-file/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;By default Kafka Connect sends its output to &lt;code&gt;stdout&lt;/code&gt;, so you’ll see it on the console, Docker logs, or wherever. Sometimes you might want to route it to file, and you can do this by reconfiguring log4j. You can also change the configuration to get more (or less) detail in the logs by changing the log level.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;sect1&#34;&gt;&#xA;&lt;h2 id=&#34;_finding_the_log_configuration_file&#34;&gt;Finding the log configuration file&lt;/h2&gt;&#xA;&lt;div class=&#34;sectionbody&#34;&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;The configuration file is called &lt;code&gt;connect-log4j.properties&lt;/code&gt; and usually found in &lt;code&gt;etc/kafka/connect-log4j.properties&lt;/code&gt;.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Replacing UTF8 non-breaking-space with bash/sed on the Mac</title>
      <link>https://rmoff.net/2019/01/21/replacing-utf8-non-breaking-space-with-bash/sed-on-the-mac/</link>
      <pubDate>Mon, 21 Jan 2019 14:01:24 +0000</pubDate>
      <guid>https://rmoff.net/2019/01/21/replacing-utf8-non-breaking-space-with-bash/sed-on-the-mac/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;A script I’d batch-run on my Markdown files had inserted a UTF-8 non-breaking-space between the Markdown heading indicator and the text, which meant that &lt;code&gt;&lt;mark&gt;#&lt;/mark&gt; My title&lt;/code&gt; actually got rendered as that, instead of an H3 title.&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Looking at the file contents, I could see it wasn’t just a space between the &lt;code&gt;#&lt;/code&gt; and the text, but a non-breaking space.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>How KSQL handles case</title>
      <link>https://rmoff.net/2019/01/21/how-ksql-handles-case/</link>
      <pubDate>Mon, 21 Jan 2019 12:05:48 +0000</pubDate>
      <guid>https://rmoff.net/2019/01/21/how-ksql-handles-case/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://www.confluent.io/ksql&#34;&gt;KSQL&lt;/a&gt; is generally case-sensitive. Very sensitive, at times ;-)&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>KSQL REST API cheatsheet</title>
      <link>https://rmoff.net/2019/01/17/ksql-rest-api-cheatsheet/</link>
      <pubDate>Thu, 17 Jan 2019 12:12:11 +0000</pubDate>
      <guid>https://rmoff.net/2019/01/17/ksql-rest-api-cheatsheet/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Full reference is &lt;a href=&#34;https://docs.confluent.io/current/ksql/docs/developer-guide/api.html&#34;&gt;here&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Confluent Schema Registry REST API cheatsheet</title>
      <link>https://rmoff.net/2019/01/17/confluent-schema-registry-rest-api-cheatsheet/</link>
      <pubDate>Thu, 17 Jan 2019 11:25:40 +0000</pubDate>
      <guid>https://rmoff.net/2019/01/17/confluent-schema-registry-rest-api-cheatsheet/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;The &lt;a href=&#34;https://docs.confluent.io/current/schema-registry/docs/index.html&#34;&gt;Schema Registry&lt;/a&gt; supports a &lt;a href=&#34;https://docs.confluent.io/current/schema-registry/docs/api.html&#34;&gt;REST API&lt;/a&gt; for finding out information about the schemas within it. Here’s a quick cheatsheet with REST calls that I often use.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>What to Do When Docker on the Mac Runs Out of Space</title>
      <link>https://rmoff.net/2019/01/09/what-to-do-when-docker-on-the-mac-runs-out-of-space/</link>
      <pubDate>Wed, 09 Jan 2019 10:18:20 +0000</pubDate>
      <guid>https://rmoff.net/2019/01/09/what-to-do-when-docker-on-the-mac-runs-out-of-space/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I use Docker and Docker Compose &lt;em&gt;a lot&lt;/em&gt;. Like, every day. It’s a fantastic way to build repeatable demos and examples, that can be torn down and spun up in a repeatable way. But…what happens when the demo that was working is spun up and then tailspins down in a blaze of flames?&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Quick Thoughts on Not Writing a Crap Abstract</title>
      <link>https://rmoff.net/2018/12/19/quick-thoughts-on-not-writing-a-crap-abstract/</link>
      <pubDate>Wed, 19 Dec 2018 22:26:04 +0000</pubDate>
      <guid>https://rmoff.net/2018/12/19/quick-thoughts-on-not-writing-a-crap-abstract/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I’ve reviewed a bunch of abstracts in the last couple of days, here are some common suggestions I made:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;ulist&#34;&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;No need to include your company name in the abstract text. Chances are I’ve not heard of your company, and even if I have, what does it add to my comprehension of your abstract and what you’re going to talk about? Possible exception would be the &amp;#34;hot&amp;#34; tech companies where people will see a talk just because it’s Netflix etc&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;I &lt;em&gt;really&lt;/em&gt; don’t want just to read your project documentation/summary. It makes me worry your talk will be death by PowerPoint of the minutiae of something that’s only relevant in your company.&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;Following on from above, I want to see that there’s going to be things you’ll share &lt;em&gt;that are useful for other people in a similar situation&lt;/em&gt;. Something that’s specific to your project, your company, doesn’t translate to mass-usefulness. Something that other people will hit, whether it’s technical or org-cultural, now &lt;em&gt;that&lt;/em&gt; is interesting and is going to be useful&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;If my eyes start to glaze over reading the abstract intro, already I’m assuming that your talk will make me bored too. Read it back out loud to yourself…make sure each word justifies its place in the text. Boilerplate filler and waffle should be left on the cutting room floor.&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;You need to strike a balance between giving enough detail about the contents of your talk that I am convinced you have interesting things to share, but without listing every nut and bolt of detail. Too much detail and it just becomes a laundry list. 
You need to whet people’s appetite for the actual meal, not put them off their food.&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;For heaven’s sake, proofread! If you can’t be arsed to use a spell checker, then I definitely wouldn’t trust you to prepare a talk of any quality. I’ve recently started using &lt;a href=&#34;https://app.grammarly.com/&#34;&gt;Grammarly&lt;/a&gt; and it’s &lt;em&gt;excellent&lt;/em&gt;.&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Moving from Ghost to Hugo</title>
      <link>https://rmoff.net/2018/12/17/moving-from-ghost-to-hugo/</link>
      <pubDate>Mon, 17 Dec 2018 23:00:21 +0000</pubDate>
      <guid>https://rmoff.net/2018/12/17/moving-from-ghost-to-hugo/</guid>
      <description>&lt;div class=&#34;sect1&#34;&gt;&#xA;&lt;h2 id=&#34;_why&#34;&gt;Why?&lt;/h2&gt;&#xA;&lt;div class=&#34;sectionbody&#34;&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;I’ve been blogging for quite a few years now, starting on Blogger, soon onto &lt;a href=&#34;https://rnm1978.wordpress.com/&#34;&gt;WordPress&lt;/a&gt;, and then to Ghost a couple of years ago. Blogger was fairly lame, WP yucky, but I really do like Ghost. It’s simple and powerful and was &lt;em&gt;perfect&lt;/em&gt; for my needs. My needs being, an outlet for technical content that respected formatting, worked with a markup language (Markdown), and didn’t &lt;em&gt;f**k things up&lt;/em&gt; in the way that WP often would in its WYSIWYG handling of content.&lt;/p&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Pull new version of multiple Docker images</title>
      <link>https://rmoff.net/2018/12/17/pull-new-version-of-multiple-docker-images/</link>
      <pubDate>Mon, 17 Dec 2018 17:44:02 +0000</pubDate>
      <guid>https://rmoff.net/2018/12/17/pull-new-version-of-multiple-docker-images/</guid>
      <description>&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Tiny little snippet this one. Given a list of images:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-shell&#34; data-lang=&#34;shell&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;$ docker images|grep confluent&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;confluentinc/cp-enterprise-kafka                5.0.0               d0c5528d7f99        &lt;span style=&#34;color:#666&#34;&gt;3&lt;/span&gt; months ago        600MB&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;confluentinc/cp-kafka                           5.0.0               373a4e31e02e        &lt;span style=&#34;color:#666&#34;&gt;3&lt;/span&gt; months ago        558MB&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;confluentinc/cp-zookeeper                       5.0.0               3cab14034c43        &lt;span style=&#34;color:#666&#34;&gt;3&lt;/span&gt; months ago        558MB&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;confluentinc/cp-ksql-server                     5.0.0               691bc3c1991f        &lt;span style=&#34;color:#666&#34;&gt;4&lt;/span&gt; months ago        493MB&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;confluentinc/cp-ksql-cli                        5.0.0               e521f3e787d6        &lt;span style=&#34;color:#666&#34;&gt;4&lt;/span&gt; months ago        488MB&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;…&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;p&gt;Now there’s a new 
version available, and you want to pull down all the latest ones for it:&lt;/p&gt;&#xA;&lt;/div&gt;&#xA;&lt;div class=&#34;paragraph&#34;&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-shell&#34; data-lang=&#34;shell&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;docker images|grep &lt;span style=&#34;color:#ba2121&#34;&gt;&amp;#34;^confluentinc&amp;#34;&lt;/span&gt;|awk &lt;span style=&#34;color:#ba2121&#34;&gt;&amp;#39;{print $1}&amp;#39;&lt;/span&gt;|xargs -Ifoo docker pull foo:5.1.0&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&#xA;&lt;/div&gt;</description>
    </item>
    <item>
      <title>Docker Tips and Tricks with Kafka Connect, ksqlDB, and Kafka</title>
      <link>https://rmoff.net/2018/12/15/docker-tips-and-tricks-with-kafka-connect-ksqldb-and-kafka/</link>
      <pubDate>Sat, 15 Dec 2018 22:00:55 +0000</pubDate>
      <guid>https://rmoff.net/2018/12/15/docker-tips-and-tricks-with-kafka-connect-ksqldb-and-kafka/</guid>
      <description>&lt;p&gt;A few years ago a colleague of mine told me about this thing called Docker, and I must admit I dismissed it as a fad…how wrong was I. Docker, and Docker Compose, are among my key tools of the trade. With them I can build self-contained environments for tutorials, demos, conference talks etc. Tear it down, run it again, without worrying that somewhere a local config changed and will break things.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Streaming data from Oracle into Kafka</title>
      <link>https://rmoff.net/2018/12/12/streaming-data-from-oracle-into-kafka/</link>
      <pubDate>Wed, 12 Dec 2018 09:49:04 +0000</pubDate>
      <guid>https://rmoff.net/2018/12/12/streaming-data-from-oracle-into-kafka/</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a short summary discussing what the options are for integrating Oracle RDBMS into Kafka, as of December 2018 (refreshed June 2020). For a more detailed background to why and how at a broader level for all databases (not just Oracle) see &lt;a href=&#34;http://cnfl.io/kafka-cdc&#34;&gt;this blog&lt;/a&gt; and &lt;a href=&#34;http://rmoff.dev/ksny19-no-more-silos&#34;&gt;this talk&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;div style=&#34;position: relative; padding-bottom: 56.25%; height: 0; overflow: hidden;&#34;&gt;&#xA;      &lt;iframe allow=&#34;accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share; fullscreen&#34; loading=&#34;eager&#34; referrerpolicy=&#34;strict-origin-when-cross-origin&#34; src=&#34;https://www.youtube.com/embed/LAoepZTapMM?autoplay=0&amp;amp;controls=1&amp;amp;end=0&amp;amp;loop=0&amp;amp;mute=0&amp;amp;start=0&#34; style=&#34;position: absolute; top: 0; left: 0; width: 100%; height: 100%; border:0;&#34; title=&#34;YouTube video&#34;&gt;&lt;/iframe&gt;&#xA;    &lt;/div&gt;&#xA;&#xA;&lt;h3 id=&#34;what-techniques--tools-are-there&#34;&gt;What techniques &amp;amp; tools are there?&lt;/h3&gt;&#xA;&lt;p&gt;&lt;em&gt;Franck Pachot has written up an excellent analysis of the options available &lt;a href=&#34;https://medium.com/@FranckPachot/ideas-for-event-sourcing-in-oracle-d4e016e90af6&#34;&gt;here&lt;/a&gt;&lt;/em&gt;.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Tools I Use: iPad Pro</title>
      <link>https://rmoff.net/2018/12/11/tools-i-use-ipad-pro/</link>
      <pubDate>Tue, 11 Dec 2018 15:12:15 +0000</pubDate>
      <guid>https://rmoff.net/2018/12/11/tools-i-use-ipad-pro/</guid>
      <description>&lt;p&gt;I&amp;rsquo;ve written recently about &lt;a href=&#34;https://rmoff.net/2018/12/10/so-how-do-you-make-those-cool-diagrams/&#34;&gt;how I create the diagrams in my blog posts and talks&lt;/a&gt;, and from discussions around that, a couple of people were interested more broadly in how I use my iPad Pro. So, on the basis that if two people are interested maybe others are (and if no-one else is, I have a copy-and-paste answer to give to those two people) here we go.&lt;/p&gt;&#xA;&lt;h3 id=&#34;kit&#34;&gt;Kit&lt;/h3&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;iPad Pro 10.5&amp;quot; (2018)&#xA;&lt;ul&gt;&#xA;&lt;li&gt;256GB model&lt;/li&gt;&#xA;&lt;li&gt;Apple Pencil&lt;/li&gt;&#xA;&lt;li&gt;Apple Keyboard&lt;/li&gt;&#xA;&lt;li&gt;iPad wallet/protector&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://www.amazon.co.uk/gp/product/B073X5BML2&#34;&gt;Matte screen protector&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;h3 id=&#34;background&#34;&gt;Background&lt;/h3&gt;&#xA;&lt;p&gt;I travel quite a lot for work, so want something with a decent battery life for stuff like:&lt;/p&gt;</description>
    </item>
    <item>
      <title>So how DO you make those cool diagrams?</title>
      <link>https://rmoff.net/2018/12/10/so-how-do-you-make-those-cool-diagrams/</link>
      <pubDate>Mon, 10 Dec 2018 12:38:18 +0000</pubDate>
      <guid>https://rmoff.net/2018/12/10/so-how-do-you-make-those-cool-diagrams/</guid>
      <description>&lt;p&gt;I &lt;a href=&#34;https://www.confluent.io/blog/author/robin/&#34;&gt;write&lt;/a&gt; and &lt;a href=&#34;https://rmoff.net/presentations/&#34;&gt;speak&lt;/a&gt; lots about Kafka, and get a fair few questions from this. The most common question is actually nothing to do with Kafka, but instead:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;How do you make those cool diagrams?&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;p&gt;So here&amp;rsquo;s a short, and longer, answer!&lt;/p&gt;&#xA;&lt;hr&gt;&#xA;&lt;h2 id=&#34;update-july-2019&#34;&gt;Update July 2019&lt;/h2&gt;&#xA;&lt;p&gt;I&amp;rsquo;ve moved away from Paper -&amp;gt; &lt;a href=&#34;https://rmoff.net/2019/07/11/so-how-do-you-make-those-cool-diagrams-july-2019-update/&#34;&gt;read more here&lt;/a&gt;&lt;/p&gt;&#xA;&lt;hr&gt;&#xA;&lt;hr&gt;&#xA;&lt;h3 id=&#34;tldr&#34;&gt;tl;dr&lt;/h3&gt;&#xA;&lt;p&gt;An iOS app called &lt;a href=&#34;https://paper.bywetransfer.com/&#34;&gt;Paper, from a company called FiftyThree&lt;/a&gt;&lt;/p&gt;&#xA;&lt;h3 id=&#34;so-how-do-you-make-those-cool-diagrams&#34;&gt;So, how DO you make those cool diagrams?&lt;/h3&gt;&#xA;&lt;p&gt;&lt;em&gt;Disclaimer: This is a style that I have copied straight from my esteemed colleagues at Confluent, including Neha Narkhede and Ben Stopford, as well as others including Martin Kleppmann.&lt;/em&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>Get mtr working on the Mac</title>
      <link>https://rmoff.net/2018/12/08/get-mtr-working-on-the-mac/</link>
      <pubDate>Sat, 08 Dec 2018 12:45:40 +0000</pubDate>
      <guid>https://rmoff.net/2018/12/08/get-mtr-working-on-the-mac/</guid>
      <description>&lt;h3 id=&#34;install&#34;&gt;Install&lt;/h3&gt;&#xA;&lt;p&gt;Not sure why the &lt;code&gt;brew&lt;/code&gt; doesn&amp;rsquo;t work as it used to, but here&amp;rsquo;s how to get it working:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;brew install mtr&#xA;sudo ln /usr/local/Cellar/mtr/0.92/sbin/mtr /usr/local/bin/mtr&#xA;sudo ln /usr/local/Cellar/mtr/0.92/sbin/mtr-packet /usr/local/bin/mtr-packet&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;&lt;em&gt;(If you don&amp;rsquo;t create the two hard links (&lt;code&gt;ln&lt;/code&gt;) you&amp;rsquo;ll get &lt;code&gt;mtr: command not found&lt;/code&gt; or &lt;code&gt;mtr: Failure to start mtr-packet: Invalid argument&lt;/code&gt;)&lt;/em&gt;&lt;/p&gt;&#xA;&lt;h3 id=&#34;run&#34;&gt;Run&lt;/h3&gt;&#xA;&lt;pre&gt;&lt;code&gt;sudo mtr google.com&#xA;&lt;/code&gt;&lt;/pre&gt;</description>
    </item>
    <item>
      <title>Kafka Connect CLI tricks</title>
      <link>https://rmoff.net/2018/12/03/kafka-connect-cli-tricks/</link>
      <pubDate>Mon, 03 Dec 2018 14:50:45 +0000</pubDate>
      <guid>https://rmoff.net/2018/12/03/kafka-connect-cli-tricks/</guid>
      <description>&lt;p&gt;I do lots of work with Kafka Connect, almost entirely in &lt;a href=&#34;https://docs.confluent.io/current/connect/concepts.html#distributed-workers&#34;&gt;Distributed mode&lt;/a&gt;—even just with 1 node -&amp;gt; makes scaling out much easier when/if needed. Because I&amp;rsquo;m using Distributed mode, I use the &lt;a href=&#34;https://docs.confluent.io/current/connect/references/restapi.html&#34;&gt;Kafka Connect REST API&lt;/a&gt; to configure and manage it. Whilst others might use GUI REST tools like Postman etc, I tend to just use the commandline. Here are some useful snippets that I use all the time.&lt;/p&gt;&#xA;&lt;p&gt;I&amp;rsquo;m showing the commands split with a line continuation character (&lt;code&gt;\&lt;/code&gt;) but you can of course run them on a single line. You might also choose to get fancy and set the Connect host and port as environment variables etc, but I leave that as an exercise for the reader :)&lt;/p&gt;</description>
    </item>
    <item>
      <title>ERROR: Invalid interpolation format for &#34;command&#34; option in service…</title>
      <link>https://rmoff.net/2018/11/20/error-invalid-interpolation-format-for-command-option-in-service/</link>
      <pubDate>Tue, 20 Nov 2018 17:47:54 +0000</pubDate>
      <guid>https://rmoff.net/2018/11/20/error-invalid-interpolation-format-for-command-option-in-service/</guid>
      <description>&lt;p&gt;Doing some funky Docker Compose stuff, including:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Flatten CDC records in KSQL</title>
      <link>https://rmoff.net/2018/10/11/flatten-cdc-records-in-ksql/</link>
      <pubDate>Thu, 11 Oct 2018 15:13:59 +0000</pubDate>
      <guid>https://rmoff.net/2018/10/11/flatten-cdc-records-in-ksql/</guid>
      <description>&lt;h3 id=&#34;the-problem---nested-messages-in-kafka&#34;&gt;The problem - nested messages in Kafka&lt;/h3&gt;&#xA;&lt;p&gt;Data comes into Kafka in many shapes and sizes. Sometimes it&amp;rsquo;s from CDC tools, and may be nested like this:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Exploring JMX with jmxterm</title>
      <link>https://rmoff.net/2018/09/19/exploring-jmx-with-jmxterm/</link>
      <pubDate>Wed, 19 Sep 2018 08:11:00 +0000</pubDate>
      <guid>https://rmoff.net/2018/09/19/exploring-jmx-with-jmxterm/</guid>
      <description>&lt;ul&gt;&#xA;&lt;li&gt;Check out the &lt;a href=&#34;https://github.com/jiaqi/jmxterm/&#34;&gt;jmxterm repository&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;Download jmxterm from &lt;a href=&#34;https://docs.cyclopsgroup.org/jmxterm&#34;&gt;https://docs.cyclopsgroup.org/jmxterm&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;</description>
    </item>
    <item>
      <title>Accessing Kafka Docker containers&#39; JMX from host</title>
      <link>https://rmoff.net/2018/09/17/accessing-kafka-docker-containers-jmx-from-host/</link>
      <pubDate>Mon, 17 Sep 2018 15:29:48 +0000</pubDate>
      <guid>https://rmoff.net/2018/09/17/accessing-kafka-docker-containers-jmx-from-host/</guid>
      <description>&lt;p&gt;&lt;em&gt;See also &lt;a href=&#34;https://docs.confluent.io/current/installation/docker/docs/operations/monitoring.html&#34;&gt;docs&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;p&gt;To help future Googlers… with the Confluent docker images for Kafka, KSQL, Kafka Connect, etc, if you want to access JMX metrics from within, you just need to pass two environment variables: &lt;code&gt;&amp;lt;x&amp;gt;_JMX_HOSTNAME&lt;/code&gt; and &lt;code&gt;&amp;lt;x&amp;gt;_JMX_PORT&lt;/code&gt;, prefixed by a component name.&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;&lt;code&gt;&amp;lt;x&amp;gt;_JMX_HOSTNAME&lt;/code&gt; - the hostname/IP of the &lt;em&gt;JMX host&lt;/em&gt; machine, &lt;em&gt;as accessible from the JMX Client&lt;/em&gt;.&lt;/p&gt;&#xA;&lt;p&gt;This is used by the JMX client to connect back into JMX, so must be accessible from the &lt;em&gt;host machine running the JMX client&lt;/em&gt;.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Sending multiline messages to Kafka</title>
      <link>https://rmoff.net/2018/09/04/sending-multiline-messages-to-kafka/</link>
      <pubDate>Tue, 04 Sep 2018 08:26:51 +0000</pubDate>
      <guid>https://rmoff.net/2018/09/04/sending-multiline-messages-to-kafka/</guid>
      <description>&lt;p&gt;(&lt;a href=&#34;https://stackoverflow.com/questions/52151816/push-multiple-line-text-as-one-message-in-a-kafka-topic/52162998#52162998&#34;&gt;SO answer repost&lt;/a&gt;)&lt;/p&gt;&#xA;&lt;p&gt;You can use &lt;a href=&#34;https://docs.confluent.io/current/app-development/kafkacat-usage.html&#34;&gt;&lt;code&gt;kafkacat&lt;/code&gt;&lt;/a&gt; to send messages to Kafka that include line breaks. To do this, use its &lt;code&gt;-D&lt;/code&gt; operator to specify a custom message delimiter (in this example &lt;code&gt;/&lt;/code&gt;):&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;kafkacat -b kafka:29092 \&#xA;        -t test_topic_01 \&#xA;        -D/ \&#xA;        -P &amp;lt;&amp;lt;EOF&#xA;this is a string message &#xA;with a line break/this is &#xA;another message with two &#xA;line breaks!&#xA;EOF&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;&lt;em&gt;Note that the delimiter &lt;strong&gt;must&lt;/strong&gt; be a single byte - multi-byte chars will end up getting included in the resulting message &lt;a href=&#34;https://github.com/edenhill/kafkacat/issues/140&#34;&gt;See issue #140&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>Window Timestamps in KSQL / Integration with Elasticsearch</title>
      <link>https://rmoff.net/2018/09/03/window-timestamps-in-ksql-/-integration-with-elasticsearch/</link>
      <pubDate>Mon, 03 Sep 2018 16:16:30 +0000</pubDate>
      <guid>https://rmoff.net/2018/09/03/window-timestamps-in-ksql-/-integration-with-elasticsearch/</guid>
      <description>&lt;p&gt;KSQL provides the ability to create windowed aggregations. For example,&#xA;count the number of messages in a 1 minute window, grouped by a&#xA;particular column:&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;CREATE TABLE RATINGS_BY_CLUB_STATUS AS \&#xA;SELECT CLUB_STATUS, COUNT(*) AS RATING_COUNT \&#xA;FROM RATINGS_WITH_CUSTOMER_DATA \&#xA;     WINDOW TUMBLING (SIZE 1 MINUTES) \&#xA;GROUP BY CLUB_STATUS;&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;How KSQL, and Kafka Streams, stores the window timestamp associated with&#xA;an aggregate, has recently changed. &lt;a href=&#34;https://github.com/confluentinc/ksql/issues/1497&#34;&gt;See #1497 for&#xA;details&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;p&gt;Whereas previously the &lt;em&gt;Kafka message timestamp&lt;/em&gt; (accessible through the&#xA;KSQL &lt;code&gt;ROWTIME&lt;/code&gt; system column) stored the start of the window for which&#xA;the aggregate had been calculated, this changed in July 2018 to instead&#xA;be the timestamp of the latest message to update that aggregate value.&#xA;This was in Apache Kafka 2.0 and Confluent Platform 5.0, and back-ported&#xA;to previous versions.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Where I&#39;m speaking in the rest of 2018</title>
      <link>https://rmoff.net/2018/08/21/where-im-speaking-in-the-rest-of-2018/</link>
      <pubDate>Tue, 21 Aug 2018 21:01:00 +0000</pubDate>
      <guid>https://rmoff.net/2018/08/21/where-im-speaking-in-the-rest-of-2018/</guid>
      <description>&lt;p&gt;There&amp;rsquo;s lots going on in the next few months :-)&lt;/p&gt;&#xA;&lt;p&gt;I&amp;rsquo;m particularly excited to be speaking at several notable conferences for the first time, including JavaZone, USENIX LISA, and Devoxx.&lt;/p&gt;&#xA;&lt;p&gt;As always, if you&amp;rsquo;re nearby then hope to see you there, and let me know if you want to meet for a coffee or beer!&lt;/p&gt;&#xA;&lt;hr&gt;&#xA;&lt;h3 id=&#34;september&#34;&gt;September&lt;/h3&gt;&#xA;&lt;h4 id=&#34;-madrid-spain&#34;&gt;🇪🇸 Madrid, Spain&lt;/h4&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;6th Sept: &lt;a href=&#34;https://www.meetup.com/apachekafkamadrid/events/251264347/&#34;&gt;Madrid Kafka Meetup&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;h4 id=&#34;-oslo-norway&#34;&gt;🇳🇴 Oslo, Norway&lt;/h4&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;10th Sept: &lt;a href=&#34;https://www.meetup.com/Oslo-Kafka/events/&#34;&gt;Oslo Kafka Meetup&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;11th Sept: &lt;a href=&#34;https://2018.javazone.no/program/73fa52a7-661c-47a0-b4f3-55aaf5b10f6b&#34;&gt;JavaZone&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;h4 id=&#34;-antwerpbrussels-belgium&#34;&gt;🇧🇪 Antwerp/Brussels, Belgium&lt;/h4&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;25th Sept: &lt;a href=&#34;https://www.meetup.com/Brussels-Apache-Kafka-Meetup-by-Confluent/events/253855467/&#34;&gt;Brussels Kafka Meetup&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;h4 id=&#34;-barcelona-spain&#34;&gt;🇪🇸 Barcelona, Spain&lt;/h4&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;26th Sept: &lt;a href=&#34;https://www.meetup.com/Barcelona-Kafka-Meetup/events/254252957/&#34;&gt;Barcelona Kafka Meetup&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;hr&gt;&#xA;&lt;h3 id=&#34;october&#34;&gt;October&lt;/h3&gt;&#xA;&lt;h4 id=&#34;-leeds-uk&#34;&gt;🇬🇧 Leeds, UK&lt;/h4&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;4th Oct: &lt;a href=&#34;https://www.meetup.com/Leeds-JVMThing/events/&#34;&gt;The JVM Thing meetup&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;h4 
id=&#34;-nashville-tn-usa&#34;&gt;🇺🇸 Nashville (TN), USA&lt;/h4&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;31st Oct: &lt;a href=&#34;https://www.usenix.org/conference/lisa18/conference-program&#34;&gt;LISA18&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;hr&gt;&#xA;&lt;h3 id=&#34;november&#34;&gt;November&lt;/h3&gt;&#xA;&lt;h4 id=&#34;-münich-germany&#34;&gt;🇩🇪 Munich, Germany&lt;/h4&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;7th Nov: &lt;a href=&#34;https://jax.de/&#34;&gt;W-JAX&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;h4 id=&#34;-antwerp-belgium&#34;&gt;🇧🇪 Antwerp, Belgium&lt;/h4&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;13th Nov: &lt;a href=&#34;https://devoxx.be/&#34;&gt;Devoxx Belgium&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;h4 id=&#34;-krakow-poland&#34;&gt;🇵🇱 Krakow, Poland&lt;/h4&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;26th Nov: &lt;a href=&#34;http://coredump.events/2018/&#34;&gt;CoreDump&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;hr&gt;&#xA;&lt;h3 id=&#34;december&#34;&gt;December&lt;/h3&gt;&#xA;&lt;h4 id=&#34;-liverpool-uk&#34;&gt;🇬🇧 Liverpool, UK&lt;/h4&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;4th Dec: &lt;a href=&#34;https://www.ukougconferences.org.uk/ukoug/frontend/reg/tAgendaWebsite.csp?pageID=306&amp;amp;eventID=2&amp;amp;language=1&amp;amp;mainFramePage=dailyagenda.csp&amp;amp;mode=&#34;&gt;UKOUG TECH 18&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;h4 id=&#34;-frankfurt-germany&#34;&gt;🇩🇪 Frankfurt, Germany&lt;/h4&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;10th Dec: &lt;a href=&#34;https://www.meetup.com/Frankfurt-Apache-Kafka-Meetup-by-Confluent/events/256599175/&#34;&gt;Apache Kafka Meetup&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;11th Dec: &lt;a href=&#34;https://www.ittage.informatik-aktuell.de/&#34;&gt;IT Days&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;</description>
    </item>
    <item>
      <title>Kafka Listeners - Explained</title>
      <link>https://rmoff.net/2018/08/02/kafka-listeners-explained/</link>
      <pubDate>Thu, 02 Aug 2018 19:38:00 +0000</pubDate>
      <guid>https://rmoff.net/2018/08/02/kafka-listeners-explained/</guid>
      <description>&lt;p&gt;&lt;em&gt;(This was cross-posted on the &lt;a href=&#34;https://www.confluent.io/blog/kafka-listeners-explained&#34;&gt;Confluent.io blog&lt;/a&gt;)&lt;/em&gt;&lt;/p&gt;&#xA;&lt;hr&gt;&#xA;&lt;p&gt;This question comes up on StackOverflow and such places a &lt;strong&gt;lot&lt;/strong&gt;, so here&amp;rsquo;s something to try and help.&lt;/p&gt;&#xA;&lt;p&gt;&lt;strong&gt;tl;dr&lt;/strong&gt; : You need to set &lt;code&gt;advertised.listeners&lt;/code&gt; (or &lt;code&gt;KAFKA_ADVERTISED_LISTENERS&lt;/code&gt; if you&amp;rsquo;re using Docker images) to the external address (host/IP) so that clients can correctly connect to it. Otherwise they&amp;rsquo;ll try to connect to the internal host address–and if that&amp;rsquo;s not reachable then problems ensue.&lt;/p&gt;&#xA;&lt;p&gt;Put another way, courtesy of Spencer Ruport:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;&lt;code&gt;LISTENERS&lt;/code&gt; are what interfaces Kafka binds to. &lt;code&gt;ADVERTISED_LISTENERS&lt;/code&gt;  are how clients can connect.&lt;/p&gt;&#xA;&lt;/blockquote&gt;</description>
    </item>
    <item>
      <title>Syntax highlighting code for presentation slides</title>
      <link>https://rmoff.net/2018/06/20/syntax-highlighting-code-for-presentation-slides/</link>
      <pubDate>Wed, 20 Jun 2018 18:32:10 +0000</pubDate>
      <guid>https://rmoff.net/2018/06/20/syntax-highlighting-code-for-presentation-slides/</guid>
      <description>&lt;p&gt;So you&amp;rsquo;ve got a code sample you want to share in a presentation, but whilst it looks beautiful in your text-editor with syntax highlighting, it&amp;rsquo;s fugly in Keynote? You could screenshot it and paste the image into your slide, but you just know that you&amp;rsquo;ll want to change that code, and end up re-snapshotting it…what a PITA.&lt;/p&gt;&#xA;&lt;p&gt;Better to have a nicely syntax-highlighted code snippet that you can paste as formatted text into Keynote and amend from there as needed. Here&amp;rsquo;s how.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Analysing Network Data with Apache Kafka, KSQL, and Elasticsearch</title>
      <link>https://rmoff.net/2018/06/17/analysing-network-data-with-apache-kafka-ksql-and-elasticsearch/</link>
      <pubDate>Sun, 17 Jun 2018 11:35:20 +0000</pubDate>
      <guid>https://rmoff.net/2018/06/17/analysing-network-data-with-apache-kafka-ksql-and-elasticsearch/</guid>
      <description>&lt;p&gt;In &lt;a href=&#34;http://cnfl.io/syslogs-filtering&#34;&gt;this article&lt;/a&gt; I demonstrated how to use KSQL to filter streams of network event data. As well as filtering, KSQL can be used to easily &lt;a href=&#34;https://www.confluent.io/blog/real-time-syslog-processing-apache-kafka-ksql-enriching-events-with-external-data/&#34;&gt;enrich streams&lt;/a&gt;. In this article we&amp;rsquo;ll see how this enriched data can be used to drive analysis in Elasticsearch and Kibana—and how KSQL again came into use for building some stream processing as a result of the discovery made.&lt;/p&gt;&#xA;&lt;p&gt;The data came from my home &lt;a href=&#34;https://www.ubnt.com/&#34;&gt;Ubiquiti&lt;/a&gt; router, and took two forms:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Compare and apply a diff / patch recursively</title>
      <link>https://rmoff.net/2018/06/07/compare-and-apply-a-diff-/-patch-recursively/</link>
      <pubDate>Thu, 07 Jun 2018 14:35:36 +0000</pubDate>
      <guid>https://rmoff.net/2018/06/07/compare-and-apply-a-diff-/-patch-recursively/</guid>
      <description>&lt;p&gt;Hacky way to keep config files in sync when there&amp;rsquo;s a new version of some software.&lt;/p&gt;&#xA;&lt;p&gt;&lt;strong&gt;Caveat : probably completely wrong, may not pick up config entries added in the new version, etc. But, &lt;em&gt;works for me right here right now&lt;/em&gt; ;-)&lt;/strong&gt;&lt;/p&gt;&#xA;&lt;p&gt;So let&amp;rsquo;s say we have two folders:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;confluent-4.1.0&#xA;confluent-4.1.1&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;Same structures, different versions. 4.1.0 was set up with our local config in &lt;code&gt;./etc&lt;/code&gt;, that we want to preserve. We can use &lt;code&gt;diff&lt;/code&gt; to easily see what&amp;rsquo;s changed:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Kafka Connect and Oracle data types</title>
      <link>https://rmoff.net/2018/05/21/kafka-connect-and-oracle-data-types/</link>
      <pubDate>Mon, 21 May 2018 08:59:00 +0000</pubDate>
      <guid>https://rmoff.net/2018/05/21/kafka-connect-and-oracle-data-types/</guid>
      <description>&lt;p&gt;The &lt;a href=&#34;https://docs.confluent.io/current/connect/connect-jdbc/docs/source_connector.html&#34;&gt;Kafka Connect JDBC Connector&lt;/a&gt; by default does not cope so well with:&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&lt;code&gt;NUMBER&lt;/code&gt; columns with no defined precision/scale. You may end up with apparent junk (&lt;code&gt;bytes&lt;/code&gt;) in the output, or just errors.&lt;/li&gt;&#xA;&lt;li&gt;&lt;code&gt;TIMESTAMP WITH LOCAL TIME ZONE&lt;/code&gt;. Throws &lt;code&gt;JDBC type -102 not currently supported&lt;/code&gt; warning in the log.&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;p&gt;Read more about &lt;code&gt;NUMBER&lt;/code&gt; data type in the &lt;a href=&#34;https://docs.oracle.com/database/121/SQLRF/sql_elements001.htm#SQLRF002220&#34;&gt;Oracle docs&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;h3 id=&#34;tldr--how-do-i-make-it-work&#34;&gt;tl;dr : How do I make it work?&lt;/h3&gt;&#xA;&lt;p&gt;There are several options:&lt;/p&gt;&#xA;&lt;h4 id=&#34;new-in-confluent-platform-411--numericmapping&#34;&gt;New in Confluent Platform 4.1.1 : &lt;code&gt;numeric.mapping&lt;/code&gt;&lt;/h4&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;In the connector configuration, set &lt;code&gt;&amp;quot;numeric.mapping&amp;quot;:&amp;quot;best_fit&amp;quot;&lt;/code&gt;&lt;/li&gt;&#xA;&lt;li&gt;New in Confluent Platform 4.1.1 (&lt;a href=&#34;https://docs.confluent.io/current/connect/connect-jdbc/docs/source_config_options.html#database&#34;&gt;Doc&lt;/a&gt;)&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;h4 id=&#34;avoid-the-problem-in-the-first-place&#34;&gt;Avoid the problem in the first place&lt;/h4&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;Change the DDL of the source object. For example:&#xA;&lt;ul&gt;&#xA;&lt;li&gt;refine the &lt;code&gt;NUMBER&lt;/code&gt; &amp;rsquo;s precision and scale&lt;/li&gt;&#xA;&lt;li&gt;Use a &lt;code&gt;TIMESTAMP&lt;/code&gt; type that is supported&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;h4 id=&#34;cast-the-datatypes-in-the-query&#34;&gt;CAST the datatypes in the &lt;code&gt;query&lt;/code&gt;&lt;/h4&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;Pull from the object directly, and use &lt;code&gt;query&lt;/code&gt; in the JDBC connector (instead of &lt;code&gt;table.whitelist&lt;/code&gt;)—and cast the columns appropriately:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Stream-Table Joins in KSQL: Stream events must be timestamped after the Table messages</title>
      <link>https://rmoff.net/2018/05/17/stream-table-joins-in-ksql-stream-events-must-be-timestamped-after-the-table-messages/</link>
      <pubDate>Thu, 17 May 2018 10:16:43 +0000</pubDate>
      <guid>https://rmoff.net/2018/05/17/stream-table-joins-in-ksql-stream-events-must-be-timestamped-after-the-table-messages/</guid>
      <description>&lt;p&gt;(preserving &lt;a href=&#34;https://stackoverflow.com/questions/50371518/kafka-ksql-simple-join-does-not-work/50390022#50390022&#34;&gt;this StackOverflow&lt;/a&gt; answer for posterity and future Googlers)&lt;/p&gt;&#xA;&lt;p&gt;&lt;strong&gt;tl;dr&lt;/strong&gt; When doing a stream-table join, your &lt;em&gt;table&lt;/em&gt; messages must already exist (and must be timestamped) &lt;em&gt;before&lt;/em&gt; the stream messages. If you re-emit your source stream messages, after the table topic is populated, the join will succeed.&lt;/p&gt;&#xA;&lt;h3 id=&#34;example-data&#34;&gt;Example data&lt;/h3&gt;&#xA;&lt;p&gt;Use &lt;code&gt;kafkacat&lt;/code&gt; to populate topics:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;kafkacat -b localhost:9092 -P -t sessionDetails &amp;lt;&amp;lt;EOF&#xA;{&amp;quot;Media&amp;quot;:&amp;quot;Foo&amp;quot;,&amp;quot;SessionIdTime&amp;quot;:&amp;quot;2018-05-17 11:25:33 BST&amp;quot;,&amp;quot;SessionIdSeq&amp;quot;:1}&#xA;{&amp;quot;Media&amp;quot;:&amp;quot;Foo&amp;quot;,&amp;quot;SessionIdTime&amp;quot;:&amp;quot;2018-05-17 11:26:33 BST&amp;quot;,&amp;quot;SessionIdSeq&amp;quot;:2}&#xA;EOF&#xA;&#xA;kafkacat -b localhost:9092 -P -t voipDetails &amp;lt;&amp;lt;EOF&#xA;{&amp;quot;SessionIdTime&amp;quot;:&amp;quot;2018-05-17 11:25:33 BST&amp;quot;,&amp;quot;SessionIdSeq&amp;quot;:1,&amp;quot;Details&amp;quot;:&amp;quot;Bar1a&amp;quot;}&#xA;{&amp;quot;SessionIdTime&amp;quot;:&amp;quot;2018-05-17 11:25:33 BST&amp;quot;,&amp;quot;SessionIdSeq&amp;quot;:1,&amp;quot;Details&amp;quot;:&amp;quot;Bar1b&amp;quot;}&#xA;{&amp;quot;SessionIdTime&amp;quot;:&amp;quot;2018-05-17 11:26:33 BST&amp;quot;,&amp;quot;SessionIdSeq&amp;quot;:2,&amp;quot;Details&amp;quot;:&amp;quot;Bar2&amp;quot;}&#xA;EOF&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;Validate topic contents:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Quick &#39;n Easy Population of Realistic Test Data into Kafka</title>
      <link>https://rmoff.net/2018/05/10/quick-n-easy-population-of-realistic-test-data-into-kafka/</link>
      <pubDate>Thu, 10 May 2018 12:56:00 +0000</pubDate>
      <guid>https://rmoff.net/2018/05/10/quick-n-easy-population-of-realistic-test-data-into-kafka/</guid>
      <description>&lt;p&gt;&lt;strong&gt;tl;dr&lt;/strong&gt; Use &lt;code&gt;curl&lt;/code&gt; to pull data from the Mockaroo REST endpoint, and pipe it into &lt;code&gt;kafkacat&lt;/code&gt;, thus:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;curl -s &amp;quot;https://api.mockaroo.com/api/d5a195e0?count=2&amp;amp;key=ff7856d0&amp;quot;| \&#xA;kafkacat -b localhost:9092 -t purchases -P&#xA;&lt;/code&gt;&lt;/pre&gt;</description>
    </item>
    <item>
      <title>Blogging v2</title>
      <link>https://rmoff.net/2018/05/09/blogging-v2/</link>
      <pubDate>Wed, 09 May 2018 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2018/05/09/blogging-v2/</guid>
      <description>&lt;p&gt;So the last post here was in 2011…seven years later I should probably post again, just to point random Google visitors to :&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;My new blog : &lt;a href=&#34;https://rmoff.net&#34;&gt;https://rmoff.net&lt;/a&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&lt;em&gt;(Wordpress is icky; Ghost FTW!)&lt;/em&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;My employer&amp;rsquo;s blog on which I also write: &lt;a href=&#34;http://cnfl.io/rmoff&#34;&gt;http://cnfl.io/rmoff&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;</description>
    </item>
    <item>
      <title>Streaming Data from MongoDB into Kafka with Kafka Connect and Debezium</title>
      <link>https://rmoff.net/2018/03/27/streaming-data-from-mongodb-into-kafka-with-kafka-connect-and-debezium/</link>
      <pubDate>Tue, 27 Mar 2018 18:52:00 +0000</pubDate>
      <guid>https://rmoff.net/2018/03/27/streaming-data-from-mongodb-into-kafka-with-kafka-connect-and-debezium/</guid>
      <description>&lt;p&gt;&lt;em&gt;Disclaimer: I am not a MongoDB person. These steps may or may not be appropriate and proper. But they worked for me :) Feel free to post in comments if I&amp;rsquo;m doing something wrong&lt;/em&gt;&lt;/p&gt;&#xA;&lt;h3 id=&#34;mongodb-config---enabling-replica-sets&#34;&gt;MongoDB config - enabling replica sets&lt;/h3&gt;&#xA;&lt;p&gt;For Debezium to be able to stream changes from MongoDB, Mongo needs to have replication configured:&lt;/p&gt;&#xA;&lt;p&gt;Docs: &lt;a href=&#34;https://docs.mongodb.com/manual/replication/&#34;&gt;Replication&lt;/a&gt; / &lt;a href=&#34;https://docs.mongodb.com/manual/tutorial/convert-standalone-to-replica-set/&#34;&gt;Convert a Standalone to a Replica Set&lt;/a&gt;&lt;/p&gt;&#xA;&lt;p&gt;Stop Mongo:&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;rmoff@proxmox01 ~&amp;gt; sudo service mongod stop&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Add replica set config to &lt;code&gt;/etc/mongod.conf&lt;/code&gt;:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Cloning Ubiquiti&#39;s MongoDB instance to a separate server</title>
      <link>https://rmoff.net/2018/03/27/cloning-ubiquitis-mongodb-instance-to-a-separate-server/</link>
      <pubDate>Tue, 27 Mar 2018 18:45:20 +0000</pubDate>
      <guid>https://rmoff.net/2018/03/27/cloning-ubiquitis-mongodb-instance-to-a-separate-server/</guid>
      <description>&lt;p&gt;DISCLAIMER: I am not a MongoDB person (even if it is &lt;a href=&#34;http://www.mongodb-is-web-scale.com/&#34;&gt;Web Scale&lt;/a&gt; X-D) - below instructions may work for you, they may not. Use with care!&lt;/p&gt;&#xA;&lt;p&gt;For some work I&amp;rsquo;ve been doing I wanted to access the data in Ubiquiti&amp;rsquo;s Unifi controller which it stores in MongoDB. Because I didn&amp;rsquo;t want to risk my actual Unifi device by changing local settings to enable remote access, and also because the version of MongoDB on it is older than ideal, I wanted to clone the data elsewhere. This article shows you how.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Streaming Data from MySQL into Kafka with Kafka Connect and Debezium</title>
      <link>https://rmoff.net/2018/03/24/streaming-data-from-mysql-into-kafka-with-kafka-connect-and-debezium/</link>
      <pubDate>Sat, 24 Mar 2018 14:58:14 +0000</pubDate>
      <guid>https://rmoff.net/2018/03/24/streaming-data-from-mysql-into-kafka-with-kafka-connect-and-debezium/</guid>
      <description>&lt;p&gt;&lt;a href=&#34;http://debezium.io/&#34;&gt;Debezium&lt;/a&gt; is a CDC tool that can stream changes from MySQL, MongoDB, and PostgreSQL into Kafka, using Kafka Connect. In this article we&amp;rsquo;ll see how to set it up and examine the format of the data. A subsequent article will show how to take this realtime stream of data from an RDBMS and join it to data originating from other sources, using KSQL.&lt;/p&gt;&#xA;&lt;p&gt;The software versions used here are:&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;Confluent Platform 4.0&lt;/li&gt;&#xA;&lt;li&gt;Debezium 0.7.2&lt;/li&gt;&#xA;&lt;li&gt;MySQL 5.7.19 with &lt;a href=&#34;https://dev.mysql.com/doc/sakila/en/sakila-installation.html&#34;&gt;Sakila sample database&lt;/a&gt; installed&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;h3 id=&#34;install-debezium&#34;&gt;Install Debezium&lt;/h3&gt;&#xA;&lt;p&gt;To use it, you need the relevant JAR for the source system (e.g. MySQL), and make that JAR available to Kafka Connect. Here we&amp;rsquo;ll set it up for MySQL.&lt;/p&gt;</description>
    </item>
    <item>
      <title>KSQL: Topic … does not conform to the requirements</title>
      <link>https://rmoff.net/2018/03/06/ksql-topic-does-not-conform-to-the-requirements/</link>
      <pubDate>Tue, 06 Mar 2018 23:08:11 +0000</pubDate>
      <guid>https://rmoff.net/2018/03/06/ksql-topic-does-not-conform-to-the-requirements/</guid>
      <description>&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;io.confluent.ksql.exception.KafkaTopicException: Topic &amp;#39;KSQL_NOTIFY&amp;#39; does not conform to the requirements Partitions:1 v 4. Replication: 1 v 1&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Why? Because the topic KSQL creates to underpin a &lt;code&gt;CREATE STREAM AS SELECT&lt;/code&gt; or &lt;code&gt;CREATE TABLE AS SELECT&lt;/code&gt; already exists, and doesn&amp;rsquo;t match what it expects. By default it will create partitions &amp;amp; replicas based on the same values of the input topic.&lt;/p&gt;&#xA;&lt;p&gt;Options:&lt;/p&gt;&#xA;&lt;ol&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;Use a different topic, via the &lt;code&gt;WITH (KAFKA_TOPIC=&#39;FOO&#39;)&lt;/code&gt; syntax, e.g.&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt; CREATE STREAM TEST WITH (KAFKA_TOPIC=&#39;FOO&#39;) AS SELECT * FROM BAR;&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;Tell KSQL to use values that match the existing topic, with the &lt;code&gt;PARTITIONS&lt;/code&gt; and &lt;code&gt;REPLICAS&lt;/code&gt; parameters. So if the existing topic only has one partition, then tell KSQL that&amp;rsquo;s what you want:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Streaming data from Kafka into Elasticsearch</title>
      <link>https://rmoff.net/2018/03/06/streaming-data-from-kafka-into-elasticsearch/</link>
      <pubDate>Tue, 06 Mar 2018 22:21:00 +0000</pubDate>
      <guid>https://rmoff.net/2018/03/06/streaming-data-from-kafka-into-elasticsearch/</guid>
      <description>&lt;p&gt;&lt;em&gt;This article is part of a series exploring Streaming ETL in practice. You can read about &lt;a href=&#34;https://rmoff.net/2018/02/01/howto-oracle-goldengate--apache-kafka--schema-registry--swingbench/&#34;&gt;setting up the ingest of realtime events from a standard Oracle platform&lt;/a&gt;, and &lt;a href=&#34;https://www.confluent.io/blog/ksql-in-action-real-time-streaming-etl-from-oracle-transactional-data&#34;&gt;building streaming ETL using KSQL&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;hr&gt;&#xA;&lt;p&gt;This post shows how we take data streaming in from an Oracle transactional system into Kafka, and simply stream it onwards into Elasticsearch. This is a common pattern, for enabling rapid search or analytics against data held in systems elsewhere.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Installing the Python Kafka library from Confluent - troubleshooting some silly errors…</title>
      <link>https://rmoff.net/2018/03/06/installing-the-python-kafka-library-from-confluent-troubleshooting-some-silly-errors/</link>
      <pubDate>Tue, 06 Mar 2018 22:18:24 +0000</pubDate>
      <guid>https://rmoff.net/2018/03/06/installing-the-python-kafka-library-from-confluent-troubleshooting-some-silly-errors/</guid>
      <description>&lt;p&gt;System:&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;rmoff@proxmox01:~$ uname -a&#xA;Linux proxmox01 4.4.6-1-pve #1 SMP Thu Apr 21 11:25:40 CEST 2016 x86_64 GNU/Linux&#xA;&#xA;rmoff@proxmox01:~$ head -n1 /etc/os-release&#xA;PRETTY_NAME=&amp;#34;Debian GNU/Linux 8 (jessie)&amp;#34;&#xA;&#xA;rmoff@proxmox01:~$ python --version&#xA;Python 2.7.9&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Following:&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://www.confluent.io/blog/introduction-to-apache-kafka-for-python-programmers/&#34;&gt;https://www.confluent.io/blog/introduction-to-apache-kafka-for-python-programmers/&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://github.com/confluentinc/confluent-kafka-python&#34;&gt;https://github.com/confluentinc/confluent-kafka-python&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;p&gt;Install &lt;code&gt;librdkafka&lt;/code&gt;, which is a pre-req for the Python library:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;wget -qO - https://packages.confluent.io/deb/4.0/archive.key | sudo apt-key add -&#xA;sudo add-apt-repository &amp;quot;deb [arch=amd64] https://packages.confluent.io/deb/4.0 stable main&amp;quot;&#xA;sudo apt-get install librdkafka-dev python-dev&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;Setup virtualenv:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;sudo apt-get install virtualenv&#xA;virtualenv kafka_push_notify&#xA;source ./kafka_push_notify/bin/activate.fish&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;Try to install &lt;code&gt;confluent-kafka&lt;/code&gt;:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Why Do We Need Streaming ETL?</title>
      <link>https://rmoff.net/2018/03/06/why-do-we-need-streaming-etl/</link>
      <pubDate>Tue, 06 Mar 2018 22:18:00 +0000</pubDate>
      <guid>https://rmoff.net/2018/03/06/why-do-we-need-streaming-etl/</guid>
      <description>&lt;p&gt;&lt;em&gt;(This is an expanded version of the intro to an article I posted over on the &lt;a href=&#34;https://www.confluent.io/blog/ksql-in-action-real-time-streaming-etl-from-oracle-transactional-data&#34;&gt;Confluent blog&lt;/a&gt;. Here I get to be as verbose as I like &lt;code&gt;;)&lt;/code&gt;)&lt;/em&gt;&lt;/p&gt;&#xA;&lt;p&gt;My first job from university was building a datawarehouse for a retailer in the UK. Back then, it was writing COBOL jobs to load tables in DB2. We waited for all the shops to close and do their end of day system processing, and send their data back to the central mainframe. From there it was checked and loaded, and then reports generated on it. This was nearly twenty years ago as my greying beard will attest—and not a lot has changed in the large majority of reporting and analytics systems since then. COBOL is maybe less common, but what has remained constant is the batch-driven nature of processing. Sometimes batches are run more frequently, and get given fancy names like intra-day ETL or even micro-batching. But batch processing it is, and as such latency is built into our reporting &lt;em&gt;by design&lt;/em&gt;. When we opt for batch processing we voluntarily inject delays into the availability of data to our end users. Much better is to build our systems around a streaming platform instead.&lt;/p&gt;</description>
    </item>
    <item>
      <title>HOWTO: Oracle GoldenGate &#43; Apache Kafka &#43; Schema Registry &#43; Swingbench</title>
      <link>https://rmoff.net/2018/02/01/howto-oracle-goldengate--apache-kafka--schema-registry--swingbench/</link>
      <pubDate>Thu, 01 Feb 2018 23:15:00 +0000</pubDate>
      <guid>https://rmoff.net/2018/02/01/howto-oracle-goldengate--apache-kafka--schema-registry--swingbench/</guid>
      <description>&lt;p&gt;&lt;em&gt;This is the detailed step-by-step if you want to recreate the process I describe in the &lt;a href=&#34;https://www.confluent.io/blog/ksql-in-action-real-time-streaming-etl-from-oracle-transactional-data&#34;&gt;Confluent blog here&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;&#xA;&lt;hr&gt;&#xA;&lt;p&gt;I used Oracle&amp;rsquo;s &lt;a href=&#34;http://www.oracle.com/technetwork/database/enterprise-edition/databaseappdev-vm-161299.html&#34;&gt;Oracle Developer Days VM&lt;/a&gt;, which comes preinstalled with Oracle 12cR2. You can see the notes on &lt;a href=&#34;https://rmoff.net/2017/11/21/installing-oracle-goldengate-for-big-data-12.3.1-with-kafka-connect-and-confluent-platform/&#34;&gt;how to do this here&lt;/a&gt;. These notes take you through installing and configuring:&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;Swingbench, to create a sample &amp;ldquo;Order Entry&amp;rdquo; schema and simulate events on the Oracle database&lt;/li&gt;&#xA;&lt;li&gt;Oracle GoldenGate (OGG, forthwith) and Oracle GoldenGate for Big Data (OGG-BD, forthwith)&#xA;&lt;ul&gt;&#xA;&lt;li&gt;I&amp;rsquo;m using Oracle GoldenGate 12.3.1 which includes the Kafka Connect handler as part of its distribution. A connector for earlier versions can be &lt;a href=&#34;http://www.oracle.com/technetwork/middleware/goldengate/oracle-goldengate-exchange-3805527.html&#34;&gt;found here&lt;/a&gt;. Some of the syntax may differ in the configuration below - if you hit problems then check out &lt;a href=&#34;&#34;&gt;an article that I wrote&lt;/a&gt; with an earlier version of the tool.&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;OGG &lt;code&gt;extract&lt;/code&gt; from the Order Entry schema&lt;/li&gt;&#xA;&lt;li&gt;Confluent Platform&lt;/li&gt;&#xA;&lt;li&gt;KSQL&lt;/li&gt;&#xA;&lt;li&gt;Elasticsearch&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;p&gt;From this point, I&amp;rsquo;ll now walk through configuring OGG-BD with the Kafka Connect handler&lt;/p&gt;</description>
    </item>
    <item>
      <title>Kafka - AdminClient - Connection to node -1 could not be established. Broker may not be available</title>
      <link>https://rmoff.net/2018/01/03/kafka-adminclient-connection-to-node-1-could-not-be-established.-broker-may-not-be-available/</link>
      <pubDate>Wed, 03 Jan 2018 11:26:00 +0000</pubDate>
      <guid>https://rmoff.net/2018/01/03/kafka-adminclient-connection-to-node-1-could-not-be-established.-broker-may-not-be-available/</guid>
      <description>&lt;hr&gt;&#xA;&lt;p&gt;&lt;strong&gt;See also &lt;a href=&#34;https://rmoff.net/2018/08/02/kafka-listeners-explained/&#34;&gt;Kafka Listeners - Explained&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;&#xA;&lt;hr&gt;&#xA;&lt;p&gt;A short post to help Googlers. On a single-node sandbox Apache Kafka / Confluent Platform installation, I was getting this error from Schema Registry, Connect, etc:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;WARN [AdminClient clientId=adminclient-3] Connection to node -1 could not be established. Broker may not be available. (org.apache.kafka.clients.NetworkClient)&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;KSQL was throwing a similar error:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;KSQL cannot initialize AdminCLient.&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;I had correctly set the machine&amp;rsquo;s hostname in my Kafka &lt;code&gt;server.properties&lt;/code&gt;:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Installing Oracle GoldenGate for Big Data 12.3.1 with Kafka Connect and Confluent Platform</title>
      <link>https://rmoff.net/2017/11/21/installing-oracle-goldengate-for-big-data-12.3.1-with-kafka-connect-and-confluent-platform/</link>
      <pubDate>Tue, 21 Nov 2017 17:31:00 +0000</pubDate>
      <guid>https://rmoff.net/2017/11/21/installing-oracle-goldengate-for-big-data-12.3.1-with-kafka-connect-and-confluent-platform/</guid>
      <description>&lt;p&gt;&lt;em&gt;Some notes that I made on installing and configuring Oracle GoldenGate with Confluent Platform. Excuse the brevity, but hopefully useful to share!&lt;/em&gt;&lt;/p&gt;&#xA;&lt;hr&gt;&#xA;&lt;p&gt;I used the &lt;a href=&#34;http://www.oracle.com/technetwork/database/enterprise-edition/databaseappdev-vm-161299.html&#34;&gt;Oracle Developer Days VM&lt;/a&gt; for this - it&amp;rsquo;s preinstalled with Oracle 12cR2. &lt;a href=&#34;http://www.oracle.com/technetwork/database/bigdata-appliance/oracle-bigdatalite-2104726.html&#34;&gt;Big Data Lite&lt;/a&gt; is nice but currently has an older version of GoldenGate.&lt;/p&gt;&#xA;&lt;p&gt;Login to the VM (oracle/oracle) and then install some useful things:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;sudo rpm -Uvh https://dl.fedoraproject.org/pub/epel/epel-release-latest-7.noarch.rpm&#xA;sudo yum install -y screen htop collectl rlwrap p7zip unzip sysstat perf iotop&#xA;sudo su -&#xA;cd /etc/yum.repos.d/&#xA;wget http://download.opensuse.org/repositories/shells:fish:release:2/CentOS_7/shells:fish:release:2.repo&#xA;yum install fish&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;Check Oracle version etc:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Where will I be at OpenWorld / Oak Table World?</title>
      <link>https://rmoff.net/2017/09/29/where-will-i-be-at-openworld-/-oak-table-world/</link>
      <pubDate>Fri, 29 Sep 2017 19:02:55 +0000</pubDate>
      <guid>https://rmoff.net/2017/09/29/where-will-i-be-at-openworld-/-oak-table-world/</guid>
      <description>&lt;p&gt;Here&amp;rsquo;s where I&amp;rsquo;ll be!&lt;/p&gt;&#xA;&lt;iframe src=&#34;https://calendar.google.com/calendar/embed?title=rmoff%20%40%20OOW17%2FOTW17&amp;amp;showNav=0&amp;amp;showDate=0&amp;amp;showPrint=0&amp;amp;showTabs=0&amp;amp;showCalendars=0&amp;amp;showTz=0&amp;amp;mode=AGENDA&amp;amp;height=600&amp;amp;wkst=1&amp;amp;bgcolor=%23FFFFFF&amp;amp;src=confluent.io_0bq6fa55a27pqun24uec7jm8sk%40group.calendar.google.com&amp;amp;color=%23B1365F&amp;amp;ctz=America%2FLos_Angeles&#34; style=&#34;border-width:0&#34; width=&#34;800&#34; height=&#34;600&#34; frameborder=&#34;0&#34; scrolling=&#34;no&#34;&gt;&lt;/iframe&gt;&#xA;&lt;p&gt;If you use Google Calendar you can click on individual entries above and select &lt;code&gt;copy to my calendar&lt;/code&gt; - which of course you&amp;rsquo;ll want to do for all the ones I&amp;rsquo;ve marked as &lt;code&gt;[SPEAKING]&lt;/code&gt; :-)&lt;/p&gt;&#xA;&lt;p&gt;Here&amp;rsquo;s a list of all the &lt;a href=&#34;https://rmoff.net/2017/09/20/apache-kafka-talks-at-oracle-openworld-javaone-and-oak-table-world-2017/&#34;&gt;Apache Kafka talks at OpenWorld and JavaOne&lt;/a&gt;, most of which I&amp;rsquo;ll be trying to get to.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Apache Kafka™ talks at Oracle OpenWorld, JavaOne, and Oak Table World 2017</title>
      <link>https://rmoff.net/2017/09/20/apache-kafka-talks-at-oracle-openworld-javaone-and-oak-table-world-2017/</link>
      <pubDate>Wed, 20 Sep 2017 15:46:00 +0000</pubDate>
      <guid>https://rmoff.net/2017/09/20/apache-kafka-talks-at-oracle-openworld-javaone-and-oak-table-world-2017/</guid>
      <description>&lt;p&gt;There&amp;rsquo;s an impressive 19 sessions that cover Apache Kafka™ at Oracle OpenWorld, JavaOne, and Oak Table World this year! You can find the full list with speakers in the session catalogs for &lt;a href=&#34;https://events.rainfocus.com/catalog/oracle/oow17/catalogoow17?search=kafka&amp;amp;showEnrolled=false&#34;&gt;OOW&lt;/a&gt;, &lt;a href=&#34;https://events.rainfocus.com/catalog/oracle/oow17/catalogjavaone17?search=kafka&amp;amp;showEnrolled=false&#34;&gt;JavaOne&lt;/a&gt;, and &lt;a href=&#34;http://www.oaktable.net/blog/oak-table-world-2017-oracle-open-world&#34;&gt;Oak Table World&lt;/a&gt;. OTW is an awesome techie conference which is at the same time as OpenWorld, next door to Moscone. Hope to see you there!&lt;/p&gt;&#xA;&lt;p&gt;&lt;em&gt;Check out the writeup of my previous visit to OOW including useful tips &lt;a href=&#34;https://www.rittmanmead.com/blog/2014/10/first-timer-tips-for-oracle-open-world/&#34;&gt;here&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>Oracle GoldenGate / Kafka Connect Handler troubleshooting</title>
      <link>https://rmoff.net/2017/09/12/oracle-goldengate-/-kafka-connect-handler-troubleshooting/</link>
      <pubDate>Tue, 12 Sep 2017 21:55:16 +0000</pubDate>
      <guid>https://rmoff.net/2017/09/12/oracle-goldengate-/-kafka-connect-handler-troubleshooting/</guid>
      <description>&lt;p&gt;The Replicat was kapput:&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;GGSCI (localhost.localdomain) 3&amp;gt; info rkconnoe&#xA;&#xA;REPLICAT   RKCONNOE  Last Started 2017-09-12 17:06   Status ABENDED&#xA;Checkpoint Lag       00:00:00 (updated 00:46:34 ago)&#xA;Log Read Checkpoint  File /u01/app/ogg/dirdat/oe000000&#xA;                     First Record  RBA 0&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;So checking the OGG error log &lt;code&gt;ggserr.log&lt;/code&gt; showed&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;2017-09-12T17:06:17.572-0400  ERROR   OGG-15051  Oracle GoldenGate Delivery, rkconnoe.prm:  Java or JNI exception:&#xA;                              oracle.goldengate.util.GGException: Error detected handling operation added event.&#xA;2017-09-12T17:06:17.572-0400  ERROR   OGG-01668  Oracle GoldenGate Delivery, rkconnoe.prm:  PROCESS ABENDING.&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;So checking the replicat log &lt;code&gt;dirrpt/RKCONNOE_info_log4j.log&lt;/code&gt; showed:&lt;/p&gt;</description>
    </item>
    <item>
      <title>What is Markdown, and Why is it Awesome?</title>
      <link>https://rmoff.net/2017/09/12/what-is-markdown-and-why-is-it-awesome/</link>
      <pubDate>Tue, 12 Sep 2017 19:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2017/09/12/what-is-markdown-and-why-is-it-awesome/</guid>
      <description>&lt;p&gt;Markdown is a plain-text formatting syntax. It enables you to write documents in plain text, readable by others in plain text, and optionally rendered into nicely formatted PDF, HTML, DOCX etc.&lt;/p&gt;&#xA;&lt;p&gt;It&amp;rsquo;s used widely in software documentation, particularly open-source, because it enables richer formatting than plain-text alone, but without constraining authors or readers to a given software platform.&lt;/p&gt;&#xA;&lt;p&gt;Platforms such as github natively support Markdown rendering - so you write your &lt;code&gt;README&lt;/code&gt; etc in markdown, and when viewed on github it is automagically rendered - without you needing to actually do anything.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Conferences &amp; Meetups at which I&#39;ll be speaking - 2017</title>
      <link>https://rmoff.net/2017/09/11/conferences-meetups-at-which-ill-be-speaking-2017/</link>
      <pubDate>Mon, 11 Sep 2017 06:45:00 +0000</pubDate>
      <guid>https://rmoff.net/2017/09/11/conferences-meetups-at-which-ill-be-speaking-2017/</guid>
      <description>&lt;p&gt;I&amp;rsquo;m excited to be speaking at several conferences and meetups over the next few months. Unsurprisingly, the topic will be Apache Kafka!&lt;/p&gt;&#xA;&lt;p&gt;If you&amp;rsquo;re at any of these, please do come and say hi :)&lt;/p&gt;&#xA;&lt;h3 id=&#34;apache-kafka-meetup---london&#34;&gt;Apache Kafka Meetup - London&lt;/h3&gt;&#xA;&lt;p&gt;&lt;em&gt;My first time talking at the London Apache Kafka Meetup - always a sold-out crowd, this will be fun!&lt;/em&gt;&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;September 20th, 19:00 : &lt;strong&gt;&lt;a href=&#34;https://www.meetup.com/Apache-Kafka-London/events/242981989/&#34;&gt;Look Ma, no Code! Building Streaming Data Pipelines with Apache Kafka&lt;/a&gt;&lt;/strong&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;Slides are &lt;a href=&#34;https://talks.rmoff.net/kafka-summit-london-2018-look-ma-no-code-building-streaming-data-pipelines-with-apache-kafka-sd/&#34;&gt;available here&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;h3 id=&#34;oracle-openworld---san-francisco&#34;&gt;Oracle OpenWorld - San Francisco&lt;/h3&gt;&#xA;&lt;p&gt;&lt;em&gt;This will be my second time at OOW - I &lt;a href=&#34;https://www.rittmanmead.com/blog/2014/10/first-timer-tips-for-oracle-open-world/&#34;&gt;wrote up my previous trip here&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>Kafka Connect - JsonDeserializer with schemas.enable requires &#34;schema&#34; and &#34;payload&#34; fields</title>
      <link>https://rmoff.net/2017/09/06/kafka-connect-jsondeserializer-with-schemas.enable-requires-schema-and-payload-fields/</link>
      <pubDate>Wed, 06 Sep 2017 12:00:25 +0000</pubDate>
      <guid>https://rmoff.net/2017/09/06/kafka-connect-jsondeserializer-with-schemas.enable-requires-schema-and-payload-fields/</guid>
      <description>&lt;p&gt;An error that I see coming up frequently in the Kafka Connect community (e.g. &lt;a href=&#34;https://groups.google.com/forum/#!forum/confluent-platform&#34;&gt;mailing list&lt;/a&gt;, &lt;a href=&#34;https://slackpass.io/confluentcommunity&#34;&gt;Slack group&lt;/a&gt;, &lt;a href=&#34;https://stackoverflow.com/questions/tagged/apache-kafka-connect&#34;&gt;StackOverflow&lt;/a&gt;) is:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;JsonDeserializer with schemas.enable requires &amp;quot;schema&amp;quot; and &amp;quot;payload&amp;quot; fields and may not contain additional fields&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;or&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;No fields found using key and value schemas for table: foo-bar&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;You can see an explanation, and solution, for the issue in my StackOverflow answer here: &lt;a href=&#34;https://stackoverflow.com/a/45940013/350613&#34;&gt;https://stackoverflow.com/a/45940013/350613&lt;/a&gt;&lt;/p&gt;&#xA;&lt;p&gt;If you&amp;rsquo;re using &lt;code&gt;schemas.enable&lt;/code&gt; in the Connector configuration, you must have &lt;code&gt;schema&lt;/code&gt; and &lt;code&gt;payload&lt;/code&gt; as the root-level elements of your JSON message (which is pretty much verbatim what the error says 😁), like this:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Simple export/import of Data Sources in Grafana</title>
      <link>https://rmoff.net/2017/08/08/simple-export/import-of-data-sources-in-grafana/</link>
      <pubDate>Tue, 08 Aug 2017 19:32:00 +0000</pubDate>
      <guid>https://rmoff.net/2017/08/08/simple-export/import-of-data-sources-in-grafana/</guid>
      <description>&lt;p&gt;&lt;a href=&#34;http://docs.grafana.org/http_api/data_source/&#34;&gt;Grafana API Reference&lt;/a&gt;&lt;/p&gt;&#xA;&lt;h3 id=&#34;export-all-grafana-data-sources-to-data_sources-folder&#34;&gt;Export all Grafana data sources to data_sources folder&lt;/h3&gt;&#xA;&lt;pre&gt;&lt;code&gt;mkdir -p data_sources &amp;amp;&amp;amp; curl -s &amp;quot;http://localhost:3000/api/datasources&amp;quot;  -u admin:admin|jq -c -M &#39;.[]&#39;|split -l 1 - data_sources/&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;This exports each data source to a separate JSON file in the &lt;code&gt;data_sources&lt;/code&gt; folder.&lt;/p&gt;&#xA;&lt;h3 id=&#34;load-data-sources-back-in-from-folder&#34;&gt;Load data sources back in from folder&lt;/h3&gt;&#xA;&lt;p&gt;This submits every file that exists in the &lt;code&gt;data_sources&lt;/code&gt; folder to Grafana as a new data source definition.&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;for i in data_sources/*; do \&#xA;&#x9;curl -X &amp;quot;POST&amp;quot; &amp;quot;http://localhost:3000/api/datasources&amp;quot; \&#xA;    -H &amp;quot;Content-Type: application/json&amp;quot; \&#xA;     --user admin:admin \&#xA;     --data-binary @$i&#xA;done&#xA;&lt;/code&gt;&lt;/pre&gt;</description>
    </item>
    <item>
      <title>Linux - USB disk connection problems - uas: probe failed with error -12</title>
      <link>https://rmoff.net/2017/06/21/linux-usb-disk-connection-problems-uas-probe-failed-with-error-12/</link>
      <pubDate>Wed, 21 Jun 2017 06:14:45 +0000</pubDate>
      <guid>https://rmoff.net/2017/06/21/linux-usb-disk-connection-problems-uas-probe-failed-with-error-12/</guid>
      <description>&lt;p&gt;Usually connecting external disks in Linux is easy. Plug it in, run &lt;code&gt;fdisk -l&lt;/code&gt; or &lt;code&gt;lsblk | grep disk&lt;/code&gt; to identify the device ID, and then &lt;code&gt;mount&lt;/code&gt; it.&lt;/p&gt;&#xA;&lt;p&gt;Unfortunately in this instance, plugging in my Seagate 2TB wasn&amp;rsquo;t so simple. The server is running Proxmox:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;# uname -a&#xA;Linux proxmox01 4.4.6-1-pve #1 SMP Thu Apr 21 11:25:40 CEST 2016 x86_64 GNU/Linux&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;No device showed up on &lt;code&gt;lsblk&lt;/code&gt; or &lt;code&gt;fdisk -l&lt;/code&gt;. In &lt;code&gt;dmesg&lt;/code&gt; I saw:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Configuring Kafka Connect to log REST HTTP messages to a separate file</title>
      <link>https://rmoff.net/2017/06/12/configuring-kafka-connect-to-log-rest-http-messages-to-a-separate-file/</link>
      <pubDate>Mon, 12 Jun 2017 15:28:15 +0000</pubDate>
      <guid>https://rmoff.net/2017/06/12/configuring-kafka-connect-to-log-rest-http-messages-to-a-separate-file/</guid>
      <description>&lt;p&gt;Kafka&amp;rsquo;s Connect API is a wondrous way of easily bringing data in and out of Apache Kafka without having to write a line of code. By choosing a Connector from &lt;a href=&#34;https://www.confluent.io/product/connectors/&#34;&gt;the many available&lt;/a&gt;, it&amp;rsquo;s possible to set up an end-to-end data pipeline with just a few lines of configuration. You can configure this by hand, or you can use the &lt;a href=&#34;https://www.confluent.io/product/control-center/&#34;&gt;Confluent Control Center&lt;/a&gt;, for both management and monitoring:&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/2017/05/Control_Center.png&#34; alt=&#34;&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;BUT &amp;hellip; there are times when not all goes well - perhaps your source has gone offline, or one of your targets has been misconfigured. What then? Well of course, it&amp;rsquo;s diagnostics time! And for diagnostics, you need logs. When you launch Kafka Connect it logs everything to &lt;code&gt;stdout&lt;/code&gt;, and this output includes content from the Kafka Connect &lt;a href=&#34;http://docs.confluent.io/current/connect/restapi.html&#34;&gt;REST interface&lt;/a&gt;. This REST interface is for configuration and control of the connectors (status/pause/resume) - and whilst Control Center is being used on the Connect configuration screens, you&amp;rsquo;ll notice that the REST interface gets polled frequently - every couple of seconds, with a greater number of requests the more connectors you have. All of this goes into the log:&lt;/p&gt;</description>
    </item>
    <item>
      <title>kafka.common.KafkaException: No key found on line 1</title>
      <link>https://rmoff.net/2017/05/12/kafka.common.kafkaexception-no-key-found-on-line-1/</link>
      <pubDate>Fri, 12 May 2017 00:52:41 +0000</pubDate>
      <guid>https://rmoff.net/2017/05/12/kafka.common.kafkaexception-no-key-found-on-line-1/</guid>
      <description>&lt;p&gt;A very silly &lt;a href=&#34;https://en.wiktionary.org/wiki/PEBCAK&#34;&gt;PEBCAK&lt;/a&gt; problem this one, but Google hits weren&amp;rsquo;t so helpful so here goes.&lt;/p&gt;&#xA;&lt;p&gt;Running a console producer, specifying keys:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;kafka-console-producer \&#xA;--broker-list localhost:9092 \&#xA;--topic test_topic \&#xA;--property parse.key=true \&#xA;--property key.seperator=,&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;Failed when I entered a key/value:&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;1,foo&#xA;kafka.common.KafkaException: No key found on line 1: 1,foo&#xA;        at kafka.tools.ConsoleProducer$LineMessageReader.readMessage(ConsoleProducer.scala:314)&#xA;        at kafka.tools.ConsoleProducer$.main(ConsoleProducer.scala:55)&#xA;        at kafka.tools.ConsoleProducer.main(ConsoleProducer.scala)&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;&lt;strong&gt;kafka.common.KafkaException: No key found on line&lt;/strong&gt; &amp;hellip; but I specified the key, didn&amp;rsquo;t I?&lt;/p&gt;&#xA;&lt;p&gt;It would help if I could spell &amp;hellip;  &lt;code&gt;key.sep&lt;/code&gt;&lt;strong&gt;e&lt;/strong&gt;&lt;code&gt;rator&lt;/code&gt; isn&amp;rsquo;t a valid property to configure. &lt;code&gt;sep&lt;/code&gt;&lt;strong&gt;a&lt;/strong&gt;&lt;code&gt;rator&lt;/code&gt; on the other hand, is:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Keeping Up with the Deluge</title>
      <link>https://rmoff.net/2017/03/11/keeping-up-with-the-deluge/</link>
      <pubDate>Sat, 11 Mar 2017 15:30:00 +0000</pubDate>
      <guid>https://rmoff.net/2017/03/11/keeping-up-with-the-deluge/</guid>
      <description>&lt;p&gt;&lt;em&gt;How do you try and stay current on technical affairs, given only 24 hours in a day and a job to do as well? Here&amp;rsquo;s my take on it&amp;hellip;&lt;/em&gt;&lt;/p&gt;&#xA;&lt;hr&gt;&#xA;&lt;p&gt;One of the many things that has changed perceptibly since the beginning of this century when I started working in IT is the amount of information freely available, and being created all the time. Back then, printed books and manuals were still the primary source of definitive information about a piece of software. Remember these? I bet you still have a few of them keeping your monitor at the right height&amp;hellip;&lt;/p&gt;</description>
    </item>
    <item>
      <title>Install qemu on AWS EC2 Amazon Linux</title>
      <link>https://rmoff.net/2017/03/11/install-qemu-on-aws-ec2-amazon-linux/</link>
      <pubDate>Sat, 11 Mar 2017 15:04:00 +0000</pubDate>
      <guid>https://rmoff.net/2017/03/11/install-qemu-on-aws-ec2-amazon-linux/</guid>
      <description>&lt;p&gt;Mucking about with virtual disks, I wanted to install &lt;code&gt;qemu&lt;/code&gt; on an AWS EC2 instance in order to use &lt;code&gt;qemu-img&lt;/code&gt;.&lt;/p&gt;&#xA;&lt;p&gt;Not finding it in a &lt;code&gt;yum&lt;/code&gt; repo, I built it from scratch:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;$ uname -a&#xA;&#xA;Linux ip-10-0-1-238 4.4.41-36.55.amzn1.x86_64 #1 SMP Wed Jan 18 01:03:26 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;Steps:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;sudo yum install -y ghc-glib-devel ghc-glib autoconf autogen intltool libtool&#xA;&#xA;wget http://download.qemu-project.org/qemu-2.8.0.tar.xz&#xA;tar xvJf qemu-2.8.0.tar.xz&#xA;cd qemu-2.8.0&#xA;./configure&#xA;make&#xA;sudo make install&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;hr&gt;&#xA;&lt;p&gt;I hit a few errors, recorded here for passing Googlers:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Mount VMDK/OVF/OVA on Amazon Web Services (AWS) EC2</title>
      <link>https://rmoff.net/2017/03/11/mount-vmdk/ovf/ova-on-amazon-web-services-aws-ec2/</link>
      <pubDate>Sat, 11 Mar 2017 14:21:00 +0000</pubDate>
      <guid>https://rmoff.net/2017/03/11/mount-vmdk/ovf/ova-on-amazon-web-services-aws-ec2/</guid>
      <description>&lt;p&gt;So you&amp;rsquo;ve got a Linux VM that you want to access the contents of in EC2 - how do you do it? Let&amp;rsquo;s see how. First up, convert the VMDK to raw image file. If you&amp;rsquo;ve got a &lt;code&gt;ova&lt;/code&gt;/&lt;code&gt;ovf&lt;/code&gt; then just untar it first (&lt;code&gt;tar -xvf my_vm.ova&lt;/code&gt;), from which you should get the VMDK. With that, convert it using &lt;code&gt;qemu-img&lt;/code&gt;:&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;$ time qemu-img convert -f vmdk -O raw SampleAppv607p-appliance-disk1.vmdk SampleAppv607p-appliance-disk1.raw&#xA;&#xA;real    16m36.740s&#xA;user    6m44.136s&#xA;sys     0m11.000s&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Inspect the image file:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Little Technology Wins</title>
      <link>https://rmoff.net/2017/03/11/little-technology-wins/</link>
      <pubDate>Sat, 11 Mar 2017 11:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2017/03/11/little-technology-wins/</guid>
      <description>&lt;h3 id=&#34;wireless-headset-for-voip-with-no-30-minute-dalek-timebomb&#34;&gt;Wireless Headset for VOIP With No 30-Minute Dalek Timebomb&lt;/h3&gt;&#xA;&lt;p&gt;A lot of my work is done remotely, with colleagues and customers. Five years ago I bought a &lt;a href=&#34;https://www.amazon.co.uk/Microsoft-JUG-00014-LifeChat-LX-3000-Headset&#34;&gt;Microsoft LifeChat LX-3000&lt;/a&gt; which plugged into the USB port on my Mac. It did the job kinda fine, with two gripes:&lt;/p&gt;&#xA;&lt;ol&gt;&#xA;&lt;li&gt;it wasn&amp;rsquo;t wireless. I like to wander whilst I chat, and I didn&amp;rsquo;t like being tethered. But this in itself wasn&amp;rsquo;t a reason to ditch it&lt;/li&gt;&#xA;&lt;li&gt;After c.30 minutes on a call, my voice would turn into a Dalek. Or rather, my voice wouldn&amp;rsquo;t, but the audio that others heard was.&#xA;This happened regardless of platform (Hangouts / Zoom / Skype / etc). I figured it must be a software or network issue. Never got to the bottom of it, until I switched to using a Snowball microphone for some proper &lt;a href=&#34;https://www.drilltodetail.com/podcast/2016/12/20/drill-to-detail-ep14-christmas-new-year-special-with-special-guest-robin-moffatt&#34;&gt;voice recording&lt;/a&gt; - and any calls I happened to also make on it no longer had the Dalek problem.&lt;/li&gt;&#xA;&lt;/ol&gt;&#xA;&lt;p&gt;So I switched, on a colleague&amp;rsquo;s recommendation, to the &lt;a href=&#34;https://www.amazon.co.uk/Logitech-H600-Wireless-Headset-Mac/&#34;&gt;Logitech H600&lt;/a&gt;. I love it. The wireless works flawlessly, and no Dalek effect. Gripes? Well there&amp;rsquo;s no pleasing some people. The audio quality is great for calls, but for music I switch back to my wireless &lt;a href=&#34;https://www.amazon.co.uk/Avantree-Comfortable-Bluetooth-Headphones-Lightweight&#34;&gt;Avantree Auditions&lt;/a&gt;.
The Logitech headset also feels a bit plastic, which I don&amp;rsquo;t care about unless I&amp;rsquo;m back here in six months complaining that it&amp;rsquo;s broken&amp;hellip;&lt;/p&gt;</description>
    </item>
    <item>
      <title>Time For a Change</title>
      <link>https://rmoff.net/2017/03/10/time-for-a-change/</link>
      <pubDate>Fri, 10 Mar 2017 17:30:00 +0000</pubDate>
      <guid>https://rmoff.net/2017/03/10/time-for-a-change/</guid>
      <description>&lt;p&gt;After 5 years at Rittman Mead, &lt;a href=&#34;https://www.rittmanmead.com/blog/author/robin-moffatt/&#34;&gt;126 blog posts&lt;/a&gt;, &lt;a href=&#34;https://talks.rmoff.net/&#34;&gt;16 conferences&lt;/a&gt;, &lt;a href=&#34;https://community.oracle.com/docs/DOC-993649&#34;&gt;four&lt;/a&gt; &lt;a href=&#34;https://community.oracle.com/docs/DOC-1010305&#34;&gt;published&lt;/a&gt; &lt;a href=&#34;https://community.oracle.com/docs/DOC-1009358&#34;&gt;OTN&lt;/a&gt; &lt;a href=&#34;https://community.oracle.com/docs/DOC-1006400&#34;&gt;articles&lt;/a&gt;, an &lt;a href=&#34;https://apex.oracle.com/pls/otn/f?p=19297:4:::NO:4:P4_ID:10100&#34;&gt;Oracle ACE award&lt;/a&gt; - not to mention, of course, a whole heap of interesting and challenging client work - I&amp;rsquo;ve decided that it&amp;rsquo;s time to do something different.&lt;/p&gt;&#xA;&lt;p&gt;Later this month I&amp;rsquo;ll be joining &lt;a href=&#34;https://confluent.io&#34;&gt;&lt;strong&gt;Confluent&lt;/strong&gt;&lt;/a&gt; as a &lt;strong&gt;Partner Technology Evangelist&lt;/strong&gt;, helping spread the good word of Apache Kafka and the &lt;a href=&#34;https://www.confluent.io/product/&#34;&gt;Confluent platform&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/2017/03/66021689.jpg&#34; alt=&#34;I&amp;rsquo;m so excited!&#34;&gt;&lt;/p&gt;&#xA;&lt;hr&gt;&#xA;&lt;p&gt;As always you can find me on Twitter &lt;a href=&#34;https://twitter.com/rmoff/&#34;&gt;@rmoff&lt;/a&gt;, for beer tweets, fried breakfast pics - and lots of Apache Kafka! You can also email me direct at &lt;a href=&#34;mailto:robin@rmoff.net&#34;&gt;robin@rmoff.net&lt;/a&gt;.&lt;/p&gt;</description>
    </item>
    <item>
      <title>HBase crash after resuming suspended VM</title>
      <link>https://rmoff.net/2017/01/20/hbase-crash-after-resuming-suspended-vm/</link>
      <pubDate>Fri, 20 Jan 2017 09:36:00 +0000</pubDate>
      <guid>https://rmoff.net/2017/01/20/hbase-crash-after-resuming-suspended-vm/</guid>
      <description>&lt;p&gt;I use &lt;a href=&#34;http://www.oracle.com/technetwork/database/bigdata-appliance/oracle-bigdatalite-2104726.html&#34;&gt;BigDataLite&lt;/a&gt; for a lot of my sandboxing work. This is an &lt;code&gt;OVA&lt;/code&gt; provided by Oracle which can be run on VirtualBox, VMWare, etc and has the Cloudera Hadoop platform (CDH) along with all of Oracle&amp;rsquo;s Big Data goodies including Big Data Discovery and Big Data Spatial and Graph (BDSG).&lt;/p&gt;&#xA;&lt;p&gt;Something that kept tripping me up during my work with BDSG was that HBase would become unavailable. Not being an HBase expert and simply using it as a data store for my property graph data, I wrote it off as mistakes on my part. But the issue kept recurring enough for me to dig into it.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Kibana Timelion - Anomaly Detection</title>
      <link>https://rmoff.net/2017/01/18/kibana-timelion-anomaly-detection/</link>
      <pubDate>Wed, 18 Jan 2017 19:53:10 +0000</pubDate>
      <guid>https://rmoff.net/2017/01/18/kibana-timelion-anomaly-detection/</guid>
      <description>&lt;p&gt;Using the &lt;code&gt;holt&lt;/code&gt; function in Timelion to do anomaly detection on Metricbeat data in Kibana:&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/2017/01/holt_-_Timelion_-_Kibana.png&#34; alt=&#34;&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;Expression:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;$thres=0.02, .es(index=&#39;metricbeat*&#39;,metric=&#39;max:system.cpu.user.pct&#39;).lines(1).if(eq, 0, null).holt(0.9, 0.1, 0.9, 0.5h).color(#eee).lines(10).label(&#39;Prediction&#39;), .es(index=&#39;metricbeat*&#39;,metric=&#39;max:system.cpu.user.pct&#39;).color(#666).lines(1).label(Actual), .es(index=&#39;metricbeat*&#39;,metric=&#39;max:system.cpu.user.pct&#39;).lines(1).if(eq, 0, null).holt(0.9, 0.1, 0.9, 0.5h).subtract(.es(index=&#39;metricbeat*&#39;,metric=&#39;max:system.cpu.user.pct&#39;)).abs().if(lt, $thres, null, .es(index=&#39;metricbeat*&#39;,metric=&#39;max:system.cpu.user.pct&#39;)).points(10,3,0).color(#c66).label(&#39;Anomaly&#39;).title(&#39;max:system.cpu.user.pct / @rmoff&#39;)&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;References:&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://twitter.com/rashidkpc/status/762754396111327232&#34;&gt;https://twitter.com/rashidkpc/status/762754396111327232&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://github.com/elastic/timelion/issues/87&#34;&gt;https://github.com/elastic/timelion/issues/87&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://github.com/elastic/timelion/blob/master/FUNCTIONS.md&#34;&gt;https://github.com/elastic/timelion/blob/master/FUNCTIONS.md&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;</description>
    </item>
    <item>
      <title>Streaming / Unbounded Data - Resources</title>
      <link>https://rmoff.net/2017/01/16/streaming-/-unbounded-data-resources/</link>
      <pubDate>Mon, 16 Jan 2017 11:10:38 +0000</pubDate>
      <guid>https://rmoff.net/2017/01/16/streaming-/-unbounded-data-resources/</guid>
      <description>&lt;ul&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://www.oreilly.com/ideas/the-world-beyond-batch-streaming-101&#34;&gt;The world beyond batch: Streaming 101&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://www.oreilly.com/ideas/the-world-beyond-batch-streaming-102&#34;&gt;The world beyond batch: Streaming 102&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://www.oreilly.com/ideas/data-architectures-for-streaming-applications&#34;&gt;Data architectures for streaming applications&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://www.se-radio.net/2016/10/se-radio-episode-272-frances-perry-on-apache-beam/&#34;&gt;SE-Radio Episode 272: Frances Perry on Apache Beam&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;p&gt;(&lt;a href=&#34;https://unsplash.com/@jacksonjost&#34;&gt;img credit&lt;/a&gt;)&lt;/p&gt;</description>
    </item>
    <item>
      <title>kafka-avro-console-producer - Error registering Avro schema / io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException</title>
      <link>https://rmoff.net/2016/12/02/kafka-avro-console-producer-error-registering-avro-schema-/-io.confluent.kafka.schemaregistry.client.rest.exceptions.restclientexception/</link>
      <pubDate>Fri, 02 Dec 2016 11:35:57 +0000</pubDate>
      <guid>https://rmoff.net/2016/12/02/kafka-avro-console-producer-error-registering-avro-schema-/-io.confluent.kafka.schemaregistry.client.rest.exceptions.restclientexception/</guid>
      <description>&lt;p&gt;By default, the &lt;code&gt;kafka-avro-console-producer&lt;/code&gt; will assume that the schema registry is on port 8081, and happily connect to it. Unfortunately, this can lead to some weird errors if another process happens to be listening on port 8081 &lt;em&gt;already&lt;/em&gt;!&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;[oracle@bigdatalite tmp]$ kafka-avro-console-producer \&#xA;&amp;gt;  --broker-list localhost:9092 --topic kudu_test \&#xA;&amp;gt;  --property value.schema=&amp;#39;{&amp;#34;type&amp;#34;:&amp;#34;record&amp;#34;,&amp;#34;name&amp;#34;:&amp;#34;myrecord&amp;#34;,&amp;#34;fields&amp;#34;:[{&amp;#34;name&amp;#34;:&amp;#34;id&amp;#34;,&amp;#34;type&amp;#34;:&amp;#34;int&amp;#34;},{&amp;#34;name&amp;#34;:&amp;#34;random_field&amp;#34;, &amp;#34;type&amp;#34;: &amp;#34;string&amp;#34;}]}&amp;#39;&#xA;&#xA;{&amp;#34;id&amp;#34;: 999, &amp;#34;random_field&amp;#34;: &amp;#34;foo&amp;#34;}&#xA;&#xA;org.apache.kafka.common.errors.SerializationException: Error registering Avro schema: {&amp;#34;type&amp;#34;:&amp;#34;record&amp;#34;,&amp;#34;name&amp;#34;:&amp;#34;myrecord&amp;#34;,&amp;#34;fields&amp;#34;:[{&amp;#34;name&amp;#34;:&amp;#34;id&amp;#34;,&amp;#34;type&amp;#34;:&amp;#34;int&amp;#34;},{&amp;#34;name&amp;#34;:&amp;#34;random_field&amp;#34;,&amp;#34;type&amp;#34;:&amp;#34;string&amp;#34;}]}&#xA;Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Unexpected character (&amp;#39;&amp;lt;&amp;#39; (code 60)): expected a valid value (number, String, array, object, &amp;#39;true&amp;#39;, &amp;#39;false&amp;#39; or &amp;#39;null&amp;#39;)&#xA; at [Source: sun.net.www.protocol.http.HttpURLConnection$HttpInputStream@4e0ae11f; line: 1, column: 2]; error code: 50005&#xA;        at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:170)&#xA;        at 
io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:187)&#xA;        at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:238)&#xA;        at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:230)&#xA;        at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:225)&#xA;        at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.registerAndGetId(CachedSchemaRegistryClient.java:59)&#xA;        at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:91)&#xA;        at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:72)&#xA;        at io.confluent.kafka.formatter.AvroMessageReader.readMessage(AvroMessageReader.java:158)&#xA;        at kafka.tools.ConsoleProducer$.main(ConsoleProducer.scala:55)&#xA;        at kafka.tools.ConsoleProducer.main(ConsoleProducer.scala)&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Solution? Make sure you specify the schema URL when you launch the producer, using &lt;code&gt;--property schema.registry.url=http://localhost:18081&lt;/code&gt; :&lt;/p&gt;</description>
    </item>
    <item>
      <title>Oracle GoldenGate -&gt; Kafka Connect - &#34;Failed to serialize Avro data&#34;</title>
      <link>https://rmoff.net/2016/11/29/oracle-goldengate-kafka-connect-failed-to-serialize-avro-data/</link>
      <pubDate>Tue, 29 Nov 2016 22:04:38 +0000</pubDate>
      <guid>https://rmoff.net/2016/11/29/oracle-goldengate-kafka-connect-failed-to-serialize-avro-data/</guid>
      <description>&lt;p&gt;&lt;strong&gt;tl;dr&lt;/strong&gt; &lt;em&gt;Make sure that &lt;code&gt;key.converter.schema.registry.url&lt;/code&gt; and &lt;code&gt;value.converter.schema.registry.url&lt;/code&gt; are specified, and that there are no trailing whitespaces.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;hr&gt;&#xA;&lt;p&gt;I&amp;rsquo;ve been building on &lt;a href=&#34;https://www.confluent.io/blog/streaming-data-oracle-using-oracle-goldengate-kafka-connect/&#34;&gt;previous work&lt;/a&gt; I&amp;rsquo;ve done with Oracle GoldenGate and Kafka Connect, looking at how to have the change records from the Oracle database come through to Kafka in Avro format rather than the default JSON that the &lt;a href=&#34;https://java.net/projects/oracledi/downloads/directory/GoldenGate/Oracle%20GoldenGate%20Adapter%20for%20Kafka%20Connect&#34;&gt;sample configuration&lt;/a&gt; gives.&lt;/p&gt;&#xA;&lt;p&gt;Simply changing the Kafka Connect OGG configuration file (&lt;code&gt;confluent.properties&lt;/code&gt;) from&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;value.converter=org.apache.kafka.connect.json.JsonConverter&#xA;key.converter=org.apache.kafka.connect.json.JsonConverter&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;to&lt;/p&gt;</description>
    </item>
    <item>
      <title>Kafka Connect - java.lang.IncompatibleClassChangeError</title>
      <link>https://rmoff.net/2016/11/24/kafka-connect-java.lang.incompatibleclasschangeerror/</link>
      <pubDate>Thu, 24 Nov 2016 20:58:44 +0000</pubDate>
      <guid>https://rmoff.net/2016/11/24/kafka-connect-java.lang.incompatibleclasschangeerror/</guid>
      <description>&lt;p&gt;I hit this error running Kafka Connect HDFS connector from Confluent Platform v3.1.1 on BigDataLite 4.6:&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;[oracle@bigdatalite ~]$ connect-standalone /etc/schema-registry/connect-avro-standalone.properties /etc/kafka-connect-hdfs/quickstart-hdfs.properties&#xA;&#xA;[...]&#xA;Exception in thread &amp;#34;main&amp;#34; java.lang.IncompatibleClassChangeError: Implementing class&#xA;        at java.lang.ClassLoader.defineClass1(Native Method)&#xA;        at java.lang.ClassLoader.defineClass(ClassLoader.java:763)&#xA;        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)&#xA;        at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)&#xA;        at java.net.URLClassLoader.access$100(URLClassLoader.java:73)&#xA;        at java.net.URLClassLoader$1.run(URLClassLoader.java:368)&#xA;        at java.net.URLClassLoader$1.run(URLClassLoader.java:362)&#xA;        at java.security.AccessController.doPrivileged(Native Method)&#xA;        at java.net.URLClassLoader.findClass(URLClassLoader.java:361)&#xA;        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)&#xA;        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)&#xA;        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)&#xA;        at java.lang.ClassLoader.defineClass1(Native Method)&#xA;        at java.lang.ClassLoader.defineClass(ClassLoader.java:763)&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;The fix was to unset the &lt;code&gt;CLASSPATH&lt;/code&gt; first:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;unset CLASSPATH&#xA;&lt;/code&gt;&lt;/pre&gt;</description>
    </item>
    <item>
      <title>boto / S3 errors</title>
      <link>https://rmoff.net/2016/10/14/boto-/-s3-errors/</link>
      <pubDate>Fri, 14 Oct 2016 08:41:30 +0000</pubDate>
      <guid>https://rmoff.net/2016/10/14/boto-/-s3-errors/</guid>
      <description>&lt;p&gt;Presented without comment, warranty, or context -  other than these might help a wandering code hacker.&lt;/p&gt;&#xA;&lt;h3 id=&#34;when-using-sigv4-you-must-specify-a-host-parameter&#34;&gt;When using SigV4, you must specify a &amp;lsquo;host&amp;rsquo; parameter&lt;/h3&gt;&#xA;&lt;pre&gt;&lt;code&gt;boto.s3.connection.HostRequiredError: BotoClientError: When using SigV4, you must specify a &#39;host&#39; parameter.&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;To fix, switch&lt;/p&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-python&#34; data-lang=&#34;python&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;conn_s3 &lt;span style=&#34;color:#666&#34;&gt;=&lt;/span&gt; boto&lt;span style=&#34;color:#666&#34;&gt;.&lt;/span&gt;connect_s3()&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;for&lt;/p&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-python&#34; data-lang=&#34;python&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;conn_s3 &lt;span style=&#34;color:#666&#34;&gt;=&lt;/span&gt; boto&lt;span style=&#34;color:#666&#34;&gt;.&lt;/span&gt;connect_s3(host&lt;span style=&#34;color:#666&#34;&gt;=&lt;/span&gt;&lt;span style=&#34;color:#ba2121&#34;&gt;&amp;#39;s3.amazonaws.com&amp;#39;&lt;/span&gt;)&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;You can see a list of endpoints &lt;a href=&#34;http://docs.aws.amazon.com/general/latest/gr/rande.html#s3_region&#34;&gt;here&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;h3 id=&#34;botoexceptions3responseerror-s3responseerror-400-bad-request&#34;&gt;boto.exception.S3ResponseError: S3ResponseError: 400 Bad Request&lt;/h3&gt;&#xA;&lt;p&gt;Make 
sure you&amp;rsquo;re specifying the correct hostname (see above) for the bucket&amp;rsquo;s region. Determine the bucket&amp;rsquo;s region from the S3 control panel, and then use the &lt;a href=&#34;http://docs.aws.amazon.com/general/latest/gr/rande.html#s3_region&#34;&gt;endpoint listed here&lt;/a&gt;.&lt;/p&gt;</description>
    </item>
    <item>
      <title>OGG-15051 oracle.goldengate.util.GGException:  Class not found: &#34;kafkahandler&#34;</title>
      <link>https://rmoff.net/2016/07/29/ogg-15051-oracle.goldengate.util.ggexception-class-not-found-kafkahandler/</link>
      <pubDate>Fri, 29 Jul 2016 07:47:30 +0000</pubDate>
      <guid>https://rmoff.net/2016/07/29/ogg-15051-oracle.goldengate.util.ggexception-class-not-found-kafkahandler/</guid>
      <description>&lt;p&gt;Similar to the &lt;a href=&#34;https://rmoff.net/2016/07/28/ogg-class-not-found-com.company.kafka.customproducerrecord/&#34;&gt;previous issue&lt;/a&gt;, the &lt;a href=&#34;http://docs.oracle.com/goldengate/bd1221/gg-bd/GADBD/GUID-2561CA12-9BAC-454B-A2E3-2D36C5C60EE5.htm#GADBD457&#34;&gt;sample config&lt;/a&gt; in the docs causes another snafu:&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;OGG-15051  Java or JNI exception:&#xA;oracle.goldengate.util.GGException:  Class not found: &amp;#34;kafkahandler&amp;#34;. kafkahandler&#xA; &#x9;Class not found: &amp;#34;kafkahandler&amp;#34;. kafkahandler&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;This time it&amp;rsquo;s in the &lt;code&gt;kafka.props&lt;/code&gt; file:&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;gg.handler.kafkahandler.Type = kafka&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Should be&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;gg.handler.kafkahandler.type = kafka&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;No capital T in Type!&lt;/p&gt;&#xA;&lt;hr&gt;&#xA;&lt;p&gt;(Image credit: &lt;a href=&#34;https://unsplash.com/@vanschneider&#34;&gt;https://unsplash.com/@vanschneider&lt;/a&gt;)&lt;/p&gt;</description>
    </item>
    <item>
      <title>OGG -  Class not found: &#34;com.company.kafka.CustomProducerRecord&#34;</title>
      <link>https://rmoff.net/2016/07/28/ogg-class-not-found-com.company.kafka.customproducerrecord/</link>
      <pubDate>Thu, 28 Jul 2016 16:34:37 +0000</pubDate>
      <guid>https://rmoff.net/2016/07/28/ogg-class-not-found-com.company.kafka.customproducerrecord/</guid>
      <description>&lt;p&gt;In the documentation for the current release of Oracle GoldenGate for Big Data (12.2.0.1.1.011) there&amp;rsquo;s a &lt;a href=&#34;https://docs.oracle.com/goldengate/bd1221/gg-bd/GADBD/GUID-2561CA12-9BAC-454B-A2E3-2D36C5C60EE5.htm#GADBD457&#34;&gt;helpful sample configuration&lt;/a&gt;, which isn&amp;rsquo;t so helpful &amp;hellip;&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;[...]&#xA;gg.handler.kafkahandler.ProducerRecordClass = com.company.kafka.CustomProducerRecord&#xA;[...]&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;This value for &lt;code&gt;gg.handler.kafkahandler.ProducerRecordClass&lt;/code&gt; will cause a failure when you start the replicat:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;[...]&#xA;Class not found: &amp;quot;com.company.kafka.CustomProducerRecord&amp;quot;&#xA;[...]&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;If you comment this configuration item out, it&amp;rsquo;ll use &lt;a href=&#34;https://docs.oracle.com/goldengate/bd1221/gg-bd/GADBD/GUID-2561CA12-9BAC-454B-A2E3-2D36C5C60EE5.htm#GADBD455&#34;&gt;the default&lt;/a&gt; (&lt;code&gt;oracle.goldengate.handler.kafka.DefaultProducerRecord&lt;/code&gt;) and work swimmingly!&lt;/p&gt;&#xA;&lt;hr&gt;&#xA;&lt;p&gt;(Image credit: &lt;a href=&#34;https://unsplash.com/@vanschneider&#34;&gt;https://unsplash.com/@vanschneider&lt;/a&gt;)&lt;/p&gt;</description>
    </item>
    <item>
      <title>Kafka Connect JDBC - Oracle - Number of groups must be positive</title>
      <link>https://rmoff.net/2016/07/27/kafka-connect-jdbc-oracle-number-of-groups-must-be-positive/</link>
      <pubDate>Wed, 27 Jul 2016 15:23:14 +0000</pubDate>
      <guid>https://rmoff.net/2016/07/27/kafka-connect-jdbc-oracle-number-of-groups-must-be-positive/</guid>
      <description>&lt;p&gt;There are &lt;a href=&#34;https://groups.google.com/forum/#!searchin/confluent-platform/%22Number$20of$20groups$20must$20be$20positive%22&#34;&gt;various reasons for this error&lt;/a&gt;, but the one I hit was that &lt;strong&gt;the table name is case sensitive&lt;/strong&gt;, and is returned from Oracle by the JDBC driver in uppercase.&lt;/p&gt;&#xA;&lt;p&gt;If you specify the table name in your connector config in lowercase, it won&amp;rsquo;t be matched, and this error is thrown. You can validate this by setting debug logging (edit &lt;code&gt;etc/kafka/connect-log4j.properties&lt;/code&gt; to set &lt;code&gt;log4j.rootLogger=DEBUG, stdout&lt;/code&gt;), and observe:  (&lt;em&gt;I&amp;rsquo;ve truncated some of the output for legibility&lt;/em&gt;)&lt;/p&gt;</description>
    </item>
    <item>
      <title>Kafka Connect - HDFS with Hive Integration - SchemaProjectorException - Schema version required</title>
      <link>https://rmoff.net/2016/07/19/kafka-connect-hdfs-with-hive-integration-schemaprojectorexception-schema-version-required/</link>
      <pubDate>Tue, 19 Jul 2016 14:36:52 +0000</pubDate>
      <guid>https://rmoff.net/2016/07/19/kafka-connect-hdfs-with-hive-integration-schemaprojectorexception-schema-version-required/</guid>
      <description>&lt;p&gt;I&amp;rsquo;ve been doing some noodling around with Confluent&amp;rsquo;s Kafka Connect recently, as part of gaining a wider understanding into Kafka. If you&amp;rsquo;re not familiar with Kafka Connect &lt;a href=&#34;http://docs.confluent.io/3.0.0/connect/design.html&#34;&gt;this page&lt;/a&gt; gives a good idea of the thinking behind it.&lt;/p&gt;&#xA;&lt;p&gt;One issue that I hit defeated my Google-fu so I&amp;rsquo;m recording it here to hopefully help out fellow n00bs.&lt;/p&gt;&#xA;&lt;p&gt;The pipeline that I&amp;rsquo;d set up looked like this:&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://github.com/Eneco/kafka-connect-twitter&#34;&gt;Eneco&amp;rsquo;s Twitter Source&lt;/a&gt; streaming tweets to a Kafka topic&lt;/li&gt;&#xA;&lt;li&gt;Confluent&amp;rsquo;s &lt;a href=&#34;https://docs.confluent.io/current/connect/kafka-connect-hdfs/index.html&#34;&gt;HDFS Sink&lt;/a&gt; to stream tweets to HDFS and define Hive table automagically over them&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;p&gt;It worked great, but only if I didn&amp;rsquo;t enable the Hive integration part. For me the integration with Hive to automatically define schemas was one of the key interests for this platform, so I wanted to see if I could get it to work. The error I got was&lt;/p&gt;</description>
    </item>
    <item>
      <title>Configuring UPS/apcupsd</title>
      <link>https://rmoff.net/2016/07/18/configuring-ups/apcupsd/</link>
      <pubDate>Mon, 18 Jul 2016 07:59:51 +0000</pubDate>
      <guid>https://rmoff.net/2016/07/18/configuring-ups/apcupsd/</guid>
      <description>&lt;p&gt;With my new server I bought a UPS, partly just as a Good Thing, but also because I suspect a powercut fried the motherboard on a previous machine that I had, and this baby is too precious to lose ;)&lt;/p&gt;&#xA;&lt;p&gt;The idea is that the UPS will smooth out the power supply to my server, protecting it from surges or temporary blips in power. If there&amp;rsquo;s a proper power cut, the UPS is connected to my server and can initiate a graceful shutdown instead of a system crash. In this day and age of laptops and iPads that you just close or switch off to &amp;ldquo;suspend&amp;rdquo; them, it seems unintuitive that killing the power to a server can damage it, but when you think about it just a moment more, it&amp;rsquo;s hardly surprising.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Spark sqlContext.read.json - java.io.IOException: No input paths specified in job</title>
      <link>https://rmoff.net/2016/07/13/spark-sqlcontext.read.json-java.io.ioexception-no-input-paths-specified-in-job/</link>
      <pubDate>Wed, 13 Jul 2016 04:50:16 +0000</pubDate>
      <guid>https://rmoff.net/2016/07/13/spark-sqlcontext.read.json-java.io.ioexception-no-input-paths-specified-in-job/</guid>
      <description>&lt;p&gt;Trying to use &lt;a href=&#34;http://spark.apache.org/docs/latest/sql-programming-guide.html#json-datasets&#34;&gt;SparkSQL to read a JSON file&lt;/a&gt;, from either pyspark or spark-shell, I got this error:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;java.io.IOException: No input paths specified in job&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;scala&amp;gt; sqlContext.read.json(&amp;#34;/u02/custom/twitter/twitter.json&amp;#34;)&#xA;java.io.IOException: No input paths specified in job&#xA;        at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:202)&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Despite the reference articles that I found using this local path syntax (&lt;code&gt;/u02/custom/twitter/twitter.json&lt;/code&gt;), it turned out that I needed to prefix it with &lt;code&gt;file://&lt;/code&gt;:&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;scala&amp;gt; sqlContext.read.json(&amp;#34;file:///u02/custom/twitter/twitter.json&amp;#34;)&#xA;res3: org.apache.spark.sql.DataFrame = [@timestamp: string, @version: string, contributors: string, coordinates: string, created_at: string, entities: 
struct&amp;lt;hashtags:array&amp;lt;struct&amp;lt;indices:array&amp;lt;bigint&amp;gt;,text:string&amp;gt;&amp;gt;,media:array&amp;lt;struct&amp;lt;display_url:string,expanded_url:string,id:bigint,id_str:string,indices:array&amp;lt;bigint&amp;gt;,media_url:string,media_url_https:string,sizes:struct&amp;lt;large:struct&amp;lt;h:bigint,resize:string,w:bigint&amp;gt;,medium:struct&amp;lt;h:bigint,resize:string,w:bigint&amp;gt;,small:struct&amp;lt;h:bigint,resize:string,w:bigint&amp;gt;,thumb:struct&amp;lt;h:bigint,resize:string,w:bigint&amp;gt;&amp;gt;,source_status_id:bigint,source_status_id_str:string,source_user_id:bigint,source_user_id_str:string,type:string,url:string&amp;gt;&amp;gt;,symbols:array&amp;lt;struct&amp;lt;indices:array&amp;lt;bigint&amp;gt;,text:string&amp;gt;&amp;gt;,urls:array&amp;lt;struct&amp;lt;display_url:string,expanded_url:string...&#xA;scala&amp;gt;&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;An alternative to &lt;code&gt;file://&lt;/code&gt; is &lt;code&gt;hdfs://&lt;/code&gt;, assuming you have some data residing there too:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Proxmox 4 Containers - ssh - ssh_exchange_identification: read: Connection reset by peer</title>
      <link>https://rmoff.net/2016/07/05/proxmox-4-containers-ssh-ssh_exchange_identification-read-connection-reset-by-peer/</link>
      <pubDate>Tue, 05 Jul 2016 15:20:37 +0000</pubDate>
      <guid>https://rmoff.net/2016/07/05/proxmox-4-containers-ssh-ssh_exchange_identification-read-connection-reset-by-peer/</guid>
      <description>&lt;p&gt;&lt;strong&gt;TL;DR&lt;/strong&gt; When defining networking on Proxmox 4 LXC containers, use an appropriate CIDR suffix (e.g. 24) - don&amp;rsquo;t use 32!&lt;/p&gt;&#xA;&lt;hr&gt;&#xA;&lt;p&gt;On my &lt;a href=&#34;https://rmoff.net/2016/06/07/commissioning-my-proxmox-server-os-and-filesystems/&#34;&gt;Proxmox 4 server&lt;/a&gt; I&amp;rsquo;m running a whole load of lovely LXC containers. Unfortunately, I had trouble connecting to them. From a client machine, I got the error&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;ssh_exchange_identification: read: Connection reset by peer&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;On the server I was connecting to (which I could get a console for through the Proxmox GUI, or a session on using &lt;code&gt;pct enter&lt;/code&gt; from the Proxmox host) I ran a SSHD process with debug to see what was happening:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Reset Hue password</title>
      <link>https://rmoff.net/2016/07/05/reset-hue-password/</link>
      <pubDate>Tue, 05 Jul 2016 13:27:06 +0000</pubDate>
      <guid>https://rmoff.net/2016/07/05/reset-hue-password/</guid>
      <description>&lt;p&gt;(&lt;a href=&#34;http://gethue.com/password-management-in-hue/&#34;&gt;Ref&lt;/a&gt;)&lt;/p&gt;&#xA;&lt;p&gt;The bit that caught me out was this kept failing with&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;Error: Password not present&#x9;&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;and a Python stack trace that ended with&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;subprocess.CalledProcessError: Command &#39;/var/run/cloudera-scm-agent/process/78-hue-HUE_SERVER/altscript.sh sec-1-secret_key&#39; returned non-zero exit status 1&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;The answer (it &lt;em&gt;seems&lt;/em&gt;) is to ensure that &lt;code&gt;HUE_SECRET_KEY&lt;/code&gt; is set (to any value!)&lt;/p&gt;&#xA;&lt;p&gt;Launch shell:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;export HUE_SECRET_KEY=foobar&#xA;/opt/cloudera/parcels/CDH-5.7.1-1.cdh5.7.1.p0.11/lib/hue/build/env/bin/hue shell&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;Reset password for &lt;code&gt;hue&lt;/code&gt;, activate account and make it superuser&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;from django.contrib.auth.models import User&#xA;user = User.objects.get(username=&#39;hue&#39;)&#xA;user.is_active=True&#xA;user.save()&#xA;user.is_superuser=True&#xA;user.save()&#xA;user.set_password(&#39;hue&#39;)&#xA;user.save()&#xA;&lt;/code&gt;&lt;/pre&gt;</description>
    </item>
    <item>
      <title>Apache Drill - conflicting jar problem - &#34;No current connection&#34;</title>
      <link>https://rmoff.net/2016/06/20/apache-drill-conflicting-jar-problem-no-current-connection/</link>
      <pubDate>Mon, 20 Jun 2016 19:04:18 +0000</pubDate>
      <guid>https://rmoff.net/2016/06/20/apache-drill-conflicting-jar-problem-no-current-connection/</guid>
      <description>&lt;p&gt;Vanilla download of Apache Drill 1.6, attempting to follow the &lt;a href=&#34;https://drill.apache.org/docs/drill-in-10-minutes/&#34;&gt;Drill in 10 Minutes&lt;/a&gt; tutorial - but I kept just getting the error &lt;code&gt;No current connection&lt;/code&gt;. Here&amp;rsquo;s an example:&lt;/p&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#666&#34;&gt;[&lt;/span&gt;oracle@bigdatalite apache-drill-1.6.0&lt;span style=&#34;color:#666&#34;&gt;]&lt;/span&gt;$ ./bin/drill-embedded&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;Java HotSpot&lt;span style=&#34;color:#666&#34;&gt;(&lt;/span&gt;TM&lt;span style=&#34;color:#666&#34;&gt;)&lt;/span&gt; 64-Bit Server VM warning: ignoring option &lt;span style=&#34;color:#19177c&#34;&gt;MaxPermSize&lt;/span&gt;&lt;span style=&#34;color:#666&#34;&gt;=&lt;/span&gt;512M; support was removed in 8.0&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;com.fasterxml.jackson.databind.JavaType.isReferenceType&lt;span style=&#34;color:#666&#34;&gt;()&lt;/span&gt;Z&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;apache drill 1.6.0&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#ba2121&#34;&gt;&amp;#34;the only truly happy people are children, the creative minority and drill users&amp;#34;&lt;/span&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;0: jdbc:drill:zk&lt;span style=&#34;color:#666&#34;&gt;=&lt;/span&gt;local&amp;gt; SELECT version FROM sys.version;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;No current connection&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;0: jdbc:drill:zk&lt;span style=&#34;color:#666&#34;&gt;=&lt;/span&gt;local&amp;gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;Whether &lt;code&gt;SELECT version FROM sys.version;&lt;/code&gt; or any other command - same result - &lt;code&gt;No current connection&lt;/code&gt;. Trying to run Drill in distributed mode also failed, with a class error&lt;/p&gt;</description>
    </item>
    <item>
      <title>ClassNotFoundException with MongoDB-Hadoop in Hive</title>
      <link>https://rmoff.net/2016/06/15/classnotfoundexception-with-mongodb-hadoop-in-hive/</link>
      <pubDate>Wed, 15 Jun 2016 17:58:19 +0000</pubDate>
      <guid>https://rmoff.net/2016/06/15/classnotfoundexception-with-mongodb-hadoop-in-hive/</guid>
      <description>&lt;p&gt;I wasted &lt;em&gt;literally&lt;/em&gt; two hours on this one, so putting down a note to hopefully help future Googlers.&lt;/p&gt;&#xA;&lt;h3 id=&#34;symptom&#34;&gt;Symptom&lt;/h3&gt;&#xA;&lt;p&gt;Here are all the various errors that I got in the &lt;code&gt;hive-server2.log&lt;/code&gt; during my attempts to get a &lt;code&gt;CREATE EXTERNAL TABLE&lt;/code&gt; to work against a MongoDB table in Hive:&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;Caused by: java.lang.ClassNotFoundException: com.mongodb.hadoop.io.BSONWritable&#xA;Caused by: java.lang.ClassNotFoundException: com.mongodb.util.JSON&#xA;Caused by: java.lang.ClassNotFoundException: org.bson.conversions.Bson&#xA;Caused by: java.lang.ClassNotFoundException: org.bson.io.OutputBuffer&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Whilst Hive would throw errors along the lines of:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Erroneous SwapFree on LXC causes problems with CDH install</title>
      <link>https://rmoff.net/2016/06/15/erroneous-swapfree-on-lxc-causes-problems-with-cdh-install/</link>
      <pubDate>Wed, 15 Jun 2016 17:52:00 +0000</pubDate>
      <guid>https://rmoff.net/2016/06/15/erroneous-swapfree-on-lxc-causes-problems-with-cdh-install/</guid>
      <description>&lt;p&gt;Installing CDH 5.7 on Linux Containers (LXC) hosted on Proxmox 4. Everything was going well until &lt;strong&gt;Cluster Setup&lt;/strong&gt;, at which point it failed on &lt;strong&gt;Start YARN (MR2 included)&lt;/strong&gt;&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;Completed only 0/1 steps. First failure: Failed to execute command Start on service YARN (MR2 Included)&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/2016/06/cdh-yarn-01-1.png&#34; alt=&#34;&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;Log &lt;code&gt;/var/log/hadoop-yarn/hadoop-cmf-yarn-NODEMANAGER-cdh57-01-node-02.moffatt.me.log.out&lt;/code&gt; showed:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;org.apache.hadoop.service.AbstractService: Service containers-monitor failed in state INITED; cause: java.lang.NumberFormatException: For input string: &amp;quot;18446744073709550364&amp;quot;&#xA;java.lang.NumberFormatException: For input string: &amp;quot;18446744073709550364&amp;quot;&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;Looking down the stack trace, this came from &lt;code&gt;org.apache.hadoop.yarn.util.LinuxResourceCalculatorPlugin.readProcMemInfoFile&lt;/code&gt;, which the &lt;a href=&#34;http://grepcode.com/file/repo1.maven.org/maven2/org.apache.hadoop/hadoop-yarn-common/0.23.1/org/apache/hadoop/yarn/util/LinuxResourceCalculatorPlugin.java#LinuxResourceCalculatorPlugin.readProcMemInfoFile%28boolean%29&#34;&gt;source code&lt;/a&gt; shows is reading &lt;code&gt;/proc/meminfo&lt;/code&gt;. Looking at this file on each node showed:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Reviving a bricked EdgeRouter Lite (ERL) from a Mac</title>
      <link>https://rmoff.net/2016/06/08/reviving-a-bricked-edgerouter-lite-erl-from-a-mac/</link>
      <pubDate>Wed, 08 Jun 2016 15:58:30 +0000</pubDate>
      <guid>https://rmoff.net/2016/06/08/reviving-a-bricked-edgerouter-lite-erl-from-a-mac/</guid>
      <description>&lt;p&gt;I&amp;rsquo;ve got an &lt;a href=&#34;https://www.ubnt.com/edgemax/edgerouter-lite/&#34;&gt;EdgeRouter LITE&lt;/a&gt; (ERL) which I used as my home router until a powercut fried it a while ago (&lt;a href=&#34;https://community.ubnt.com/t5/EdgeMAX/2nd-Failed-ERLite/m-p/601815&#34;&gt;looks like I&amp;rsquo;m not the only one to have this issue&lt;/a&gt;). The symptoms were it powering on but not giving out any DHCP addresses, nor responding on the default IP of 192.168.1.1 even after a factory reset. It was a real shame, because it had been a great bit of kit up until then. I am a complete hack when it comes to networking, and it struck the right balance between letting me do what I needed to do and not overwhelming me with complexity.&#xA;I&amp;rsquo;d replaced it with a SonicWall TZ105, but I utterly failed to get the latter to permit OpenVPN traffic (so I can access my home server when on the road), something I had done with no problem on the ERL. So I thought I&amp;rsquo;d try to resurrect the ERL using &lt;a href=&#34;https://help.ubnt.com/hc/en-us/articles/204959514-EdgeMAX-Last-resort-recovery-of-failed-EdgeOS-device&#34;&gt;the instructions here&lt;/a&gt;.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Running a Docker Container on Proxmox for BitTorrent Sync</title>
      <link>https://rmoff.net/2016/06/07/running-a-docker-container-on-proxmox-for-bittorrent-sync/</link>
      <pubDate>Tue, 07 Jun 2016 21:43:26 +0000</pubDate>
      <guid>https://rmoff.net/2016/06/07/running-a-docker-container-on-proxmox-for-bittorrent-sync/</guid>
      <description>&lt;p&gt;(&lt;a href=&#34;https://rmoff.net/2016/06/07/a-new-arrival/&#34;&gt;Previously&lt;/a&gt;, &lt;a href=&#34;https://rmoff.net/2016/06/07/commissioning-my-proxmox-server-os-and-filesystems/&#34;&gt;previously&lt;/a&gt;, &lt;a href=&#34;https://rmoff.net/2016/06/07/importing-vmware-and-virtualbox-vms-to-proxmox/&#34;&gt;previously&lt;/a&gt;)&lt;/p&gt;&#xA;&lt;p&gt;Since Proxmox 4 has a recent Linux kernel and mainline one at that, it means that Docker can be run on it. I&amp;rsquo;ve yet to really dig into Docker and work out when it makes sense in place of Linux Containers (LXC), so this is going to be a learning experience for me.&lt;/p&gt;&#xA;&lt;p&gt;To install Docker, add Backports repo to apt:&lt;/p&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;root@proxmox01:~# cat /etc/apt/sources.list.d/backports.list&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;deb http://ftp.debian.org/debian jessie-backports main&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;And then install:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Importing VMWare and VirtualBox VMs to Proxmox</title>
      <link>https://rmoff.net/2016/06/07/importing-vmware-and-virtualbox-vms-to-proxmox/</link>
      <pubDate>Tue, 07 Jun 2016 21:14:26 +0000</pubDate>
      <guid>https://rmoff.net/2016/06/07/importing-vmware-and-virtualbox-vms-to-proxmox/</guid>
      <description>&lt;p&gt;(&lt;a href=&#34;https://rmoff.net/2016/06/07/a-new-arrival/&#34;&gt;Previously&lt;/a&gt;, &lt;a href=&#34;https://rmoff.net/2016/06/07/commissioning-my-proxmox-server-os-and-filesystems/&#34;&gt;previously&lt;/a&gt;)&lt;/p&gt;&#xA;&lt;p&gt;I&amp;rsquo;ve got a bunch of existing VirtualBox and VMWare VMs that I want to run on Proxmox. Eventually I&amp;rsquo;ll migrate them to containers, but for the time being run them as &amp;ldquo;fat&amp;rdquo; VMs using Proxmox&amp;rsquo;s KVM virtualisation. After copying the OVA files that I had to the server, I uncompressed them:&lt;/p&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;root@proxmox01:/data04/vms/bdl44-biwa# &lt;span style=&#34;color:#008000&#34;&gt;cd&lt;/span&gt; ../bdl44&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;root@proxmox01:/data04/vms/bdl44# ll&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;total &lt;span style=&#34;color:#666&#34;&gt;27249328&lt;/span&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;-rw------- &lt;span style=&#34;color:#666&#34;&gt;1&lt;/span&gt; root root &lt;span style=&#34;color:#666&#34;&gt;27903306752&lt;/span&gt; Jun  &lt;span style=&#34;color:#666&#34;&gt;1&lt;/span&gt; 10:14 BigDataLite440.ova&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;root@proxmox01:/data04/vms/bdl44# tar -xf BigDataLite440.ova&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;root@proxmox01:/data04/vms/bdl44# ll&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;total &lt;span 
style=&#34;color:#666&#34;&gt;54498668&lt;/span&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;-rw------- &lt;span style=&#34;color:#666&#34;&gt;1&lt;/span&gt; root root  &lt;span style=&#34;color:#666&#34;&gt;7300486656&lt;/span&gt; Feb &lt;span style=&#34;color:#666&#34;&gt;18&lt;/span&gt; 21:25 BigDataLite440-disk1.vmdk&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;-rw------- &lt;span style=&#34;color:#666&#34;&gt;1&lt;/span&gt; root root  &lt;span style=&#34;color:#666&#34;&gt;1261044224&lt;/span&gt; Feb &lt;span style=&#34;color:#666&#34;&gt;18&lt;/span&gt; 21:26 BigDataLite440-disk2.vmdk&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;-rw------- &lt;span style=&#34;color:#666&#34;&gt;1&lt;/span&gt; root root &lt;span style=&#34;color:#666&#34;&gt;19295202816&lt;/span&gt; Feb &lt;span style=&#34;color:#666&#34;&gt;18&lt;/span&gt; 21:48 BigDataLite440-disk3.vmdk&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;-rw------- &lt;span style=&#34;color:#666&#34;&gt;1&lt;/span&gt; root root    &lt;span style=&#34;color:#666&#34;&gt;46550528&lt;/span&gt; Feb &lt;span style=&#34;color:#666&#34;&gt;18&lt;/span&gt; 21:48 BigDataLite440-disk4.vmdk&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;-rw------- &lt;span style=&#34;color:#666&#34;&gt;1&lt;/span&gt; root root &lt;span style=&#34;color:#666&#34;&gt;27903306752&lt;/span&gt; Jun  &lt;span style=&#34;color:#666&#34;&gt;1&lt;/span&gt; 10:14 BigDataLite440.ova&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;-rw------- &lt;span style=&#34;color:#666&#34;&gt;1&lt;/span&gt; root root       &lt;span style=&#34;color:#666&#34;&gt;19619&lt;/span&gt; Feb &lt;span style=&#34;color:#666&#34;&gt;18&lt;/span&gt; 21:15 
BigDataLite440.ovf&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;and then converted each disk image to qcow2 format:&#xA;(&lt;em&gt;You can read more about how and why &lt;a href=&#34;https://www.jamescoyle.net/how-to/1218-upload-ova-to-proxmox-kvm&#34;&gt;here&lt;/a&gt; and &lt;a href=&#34;https://pve.proxmox.com/wiki/Migration_of_servers_to_Proxmox_VE#VMware_to_Proxmox_VE_.28KVM.29&#34;&gt;here&lt;/a&gt;&lt;/em&gt;).&lt;/p&gt;</description>
    </item>
    <item>
      <title>Commissioning my Proxmox Server - OS and filesystems</title>
      <link>https://rmoff.net/2016/06/07/commissioning-my-proxmox-server-os-and-filesystems/</link>
      <pubDate>Tue, 07 Jun 2016 21:03:22 +0000</pubDate>
      <guid>https://rmoff.net/2016/06/07/commissioning-my-proxmox-server-os-and-filesystems/</guid>
      <description>&lt;p&gt;(&lt;a href=&#34;https://rmoff.net/2016/06/07/a-new-arrival/&#34;&gt;Previously&lt;/a&gt;)&lt;/p&gt;&#xA;&lt;p&gt;With my server in place, I ran a memtest on it &amp;hellip; which with 128G took a while ;)&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/2016/06/IMG_7889.jpg&#34; alt=&#34;memtest&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;And then installed &lt;a href=&#34;https://www.proxmox.com/en/&#34;&gt;Proxmox 4&lt;/a&gt;, using a bootable USB that I&amp;rsquo;d created on my Mac from the ISO downloaded from Proxmox&amp;rsquo;s website. To create the bootable USB, create the &lt;code&gt;img&lt;/code&gt; file:&lt;/p&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;hdiutil convert -format UDRW -o target.img source.iso&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;and then burn it to USB:&lt;/p&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;sudo dd &lt;span style=&#34;color:#008000;font-weight:bold&#34;&gt;if&lt;/span&gt;&lt;span style=&#34;color:#666&#34;&gt;=&lt;/span&gt;target.img &lt;span style=&#34;color:#19177c&#34;&gt;of&lt;/span&gt;&lt;span style=&#34;color:#666&#34;&gt;=&lt;/span&gt;/dev/rdiskN &lt;span style=&#34;color:#19177c&#34;&gt;bs&lt;/span&gt;&lt;span style=&#34;color:#666&#34;&gt;=&lt;/span&gt;1m&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;Replace &lt;strong&gt;&lt;code&gt;N&lt;/code&gt;&lt;/strong&gt; with the correct device based on &lt;code&gt;diskutil list&lt;/code&gt; output. 
Don&amp;rsquo;t get it wrong, else you&amp;rsquo;ll properly knacker your machine :D&lt;/p&gt;</description>
    </item>
    <item>
      <title>A New Arrival</title>
      <link>https://rmoff.net/2016/06/07/a-new-arrival/</link>
      <pubDate>Tue, 07 Jun 2016 20:43:20 +0000</pubDate>
      <guid>https://rmoff.net/2016/06/07/a-new-arrival/</guid>
      <description>&lt;p&gt;After a long and painful delivery, I&amp;rsquo;m delighted to announce the arrival of a new addition to my household &amp;hellip; :&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/2016/06/IMG_3813.jpg&#34; alt=&#34;&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;This &lt;a href=&#34;https://www.scan.co.uk/3xs/shared/98f6ed5b-7fc4-492c-b66c-3c0e4117dd9c&#34;&gt;custom-build from Scan 3XS&lt;/a&gt; is sat in my study quietly humming away. I&amp;rsquo;m going to use it for hosting VMs for R&amp;amp;D on OBIEE, Big Data Lite, Elastic, InfluxDB, Kafka, etc.&#xA;I&amp;rsquo;ll blog various installations that I&amp;rsquo;ve done on it as a reference for myself, and anyone else interested. Which I guess means, myself ;)&lt;/p&gt;</description>
    </item>
    <item>
      <title>New version of BigDataLite VM from Oracle</title>
      <link>https://rmoff.net/2016/06/06/new-version-of-bigdatalite-vm-from-oracle/</link>
      <pubDate>Mon, 06 Jun 2016 22:28:25 +0000</pubDate>
      <guid>https://rmoff.net/2016/06/06/new-version-of-bigdatalite-vm-from-oracle/</guid>
      <description>&lt;p&gt;Oracle&amp;rsquo;s excellent &lt;a href=&#34;http://www.oracle.com/technetwork/database/bigdata-appliance/oracle-bigdatalite-2104726.html&#34;&gt;Big Data Lite VM&lt;/a&gt; has been updated, to version 4.5.&lt;/p&gt;&#xA;&lt;p&gt;&lt;a href=&#34;http://www.oracle.com/technetwork/database/bigdata-appliance/oracle-bigdatalite-2104726.html#introduction&#34;&gt;Download it here&lt;/a&gt;&lt;/p&gt;&#xA;&lt;p&gt;Changes:&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;CDH 5.5 -&amp;gt; 5.7&lt;/li&gt;&#xA;&lt;li&gt;Big Data Spatial and Graph 1.1 -&amp;gt; 1.2&lt;/li&gt;&#xA;&lt;li&gt;Big Data Discovery 1.1 -&amp;gt; 1.2&lt;/li&gt;&#xA;&lt;li&gt;Oracle Big Data Connectors 4.4 -&amp;gt; 4.5&lt;/li&gt;&#xA;&lt;li&gt;Oracle NoSQL 3.5 -&amp;gt; 4.0&lt;/li&gt;&#xA;&lt;li&gt;GoldenGate 12.2.0.1 -&amp;gt; 12.2.0.1.1&lt;/li&gt;&#xA;&lt;/ul&gt;</description>
    </item>
    <item>
      <title>OBIEE 12c blog posts</title>
      <link>https://rmoff.net/2016/06/01/obiee-12c-blog-posts/</link>
      <pubDate>Wed, 01 Jun 2016 22:30:14 +0000</pubDate>
      <guid>https://rmoff.net/2016/06/01/obiee-12c-blog-posts/</guid>
      <description>&lt;p&gt;I&amp;rsquo;ve been spending some interesting hours digging into OBIEE 12c recently, with some blog posts to show for it. Some of it is just curiosities discovered along the way, but the real meaty stuff is in the RESTful APIs - lots of potential here for cool integrations I think&amp;hellip;&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://rmoff.net/2016/05/24/lifting-the-lid-on-obiee-12c-web-services-part-1/&#34;&gt;Lifting the Lid on OBIEE 12c Web Services - Part 1&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://rmoff.net/2016/05/28/lifting-the-lid-on-obiee-12c-web-services-part-2/&#34;&gt;Lifting the Lid on OBIEE 12c Web Services - Part 2&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://ritt.md/obiee12c-xsa-dss&#34;&gt;Extended Subject Areas (XSA) and the Data Set Service &lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://ritt.md/obi-12c-cache&#34;&gt;Changes in BI Server Cache Behaviour in OBIEE 12c : &lt;code&gt;OBIS_REFRESH_CACHE&lt;/code&gt; &lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://rmoff.net/2016/05/27/dynamic-naming-of-obiee-12c-service-instance-exports/&#34;&gt;Dynamic Naming of OBIEE 12c Service Instance Exports&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://rmoff.net/2016/05/27/obiee-12c-add-data-source-in-answers/&#34;&gt;OBIEE 12c - &amp;ldquo;Add Data Source&amp;rdquo; in Answers&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;hr&gt;&#xA;&lt;p&gt;(Photo credit: &lt;a href=&#34;https://unsplash.com/@jluebke&#34;&gt;https://unsplash.com/@jluebke&lt;/a&gt;)&lt;/p&gt;</description>
    </item>
    <item>
      <title>Presentation Services Logsources in OBIEE 12c</title>
      <link>https://rmoff.net/2016/06/01/presentation-services-logsources-in-obiee-12c/</link>
      <pubDate>Wed, 01 Jun 2016 11:03:00 +0000</pubDate>
      <guid>https://rmoff.net/2016/06/01/presentation-services-logsources-in-obiee-12c/</guid>
      <description>&lt;p&gt;Presentation Services can provide some very detailed logs, useful for troubleshooting, performance tracing, and general poking around. &lt;a href=&#34;http://www.rittmanmead.com/2014/11/auditing-obiee-presentation-catalog-activity-with-custom-log-filters/&#34;&gt;See here&lt;/a&gt; for details.&lt;/p&gt;&#xA;&lt;p&gt;There&amp;rsquo;s no &lt;code&gt;bi-init.sh&lt;/code&gt; in 12c, so we need to set up the &lt;code&gt;LD_LIBRARY_PATH&lt;/code&gt; ourselves:&lt;/p&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#008000&#34;&gt;export&lt;/span&gt; &lt;span style=&#34;color:#19177c&#34;&gt;LD_LIBRARY_PATH&lt;/span&gt;&lt;span style=&#34;color:#666&#34;&gt;=&lt;/span&gt;&lt;span style=&#34;color:#19177c&#34;&gt;$LD_LIBRARY_PATH&lt;/span&gt;:/app/oracle/biee/bi/bifoundation/web/bin/:/app/oracle/biee/bi/lib/:/app/oracle/biee/lib/:/app/oracle/biee/bi/bifoundation/odbc/lib/&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;Run &lt;code&gt;sawserver&lt;/code&gt; with the &lt;code&gt;-logsources&lt;/code&gt; flag to list all log sources:&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;/app/oracle/biee/bi/bifoundation/web/bin/sawserver -logsources &amp;gt; saw_logsources_12.2.1.txt&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Full list: &lt;a href=&#34;https://gist.github.com/rmoff/e3be9009da6130839c71181cb58509a0&#34;&gt;https://gist.github.com/rmoff/e3be9009da6130839c71181cb58509a0&lt;/a&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>Lifting the Lid on OBIEE 12c Web Services - Part 2</title>
      <link>https://rmoff.net/2016/05/28/lifting-the-lid-on-obiee-12c-web-services-part-2/</link>
      <pubDate>Sat, 28 May 2016 20:30:00 +0000</pubDate>
      <guid>https://rmoff.net/2016/05/28/lifting-the-lid-on-obiee-12c-web-services-part-2/</guid>
      <description>&lt;p&gt;In OBIEE 12c, &lt;code&gt;data-model-cmd&lt;/code&gt; is a wrapper for some java code which ultimately calls an internal RESTful web service, &lt;code&gt;bi-lcm&lt;/code&gt;. We saw in the &lt;a href=&#34;https://rmoff.net/2016/05/24/lifting-the-lid-on-obiee-12c-web-services-part-1/&#34;&gt;previous post&lt;/a&gt; how these internal web services can be opened up slightly, and we&amp;rsquo;re going to do the same again here. Which means, time for the same caveat:&lt;/p&gt;&#xA;&lt;hr&gt;&#xA;&lt;p&gt;&lt;strong&gt;None of these Web Services are documented, and they should therefore be assumed to be completely unsupported by Oracle. This article is purely for geek interest. Using undocumented APIs leaves you at risk of the API changing at any time.&lt;/strong&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>Dynamic Naming of OBIEE 12c Service Instance Exports</title>
      <link>https://rmoff.net/2016/05/27/dynamic-naming-of-obiee-12c-service-instance-exports/</link>
      <pubDate>Fri, 27 May 2016 09:13:00 +0000</pubDate>
      <guid>https://rmoff.net/2016/05/27/dynamic-naming-of-obiee-12c-service-instance-exports/</guid>
      <description>&lt;p&gt;&lt;a href=&#34;http://docs.oracle.com/middleware/1221/biee/BIESG/configrepos.htm#BIESG9314&#34;&gt;&lt;code&gt;exportServiceInstance&lt;/code&gt;&lt;/a&gt; will export the RPD, Presentation Catalog, and Security model (application roles &amp;amp; policies etc &amp;ndash; but &lt;em&gt;not&lt;/em&gt; WLS LDAP) into a single &lt;code&gt;.bar&lt;/code&gt; file, from which they can be imported to another environment, or restored to the same one at a later date (e.g. for backup/restore).&lt;/p&gt;&#xA;&lt;p&gt;To run &lt;code&gt;exportServiceInstance&lt;/code&gt; you need to launch WLST first. The following demonstrates how to call it, and embeds the current timestamp &amp;amp; machine details in the backup (useful info, and also makes the backup name unique each time).&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBIEE 12c - &#34;Add Data Source&#34; in Answers</title>
      <link>https://rmoff.net/2016/05/27/obiee-12c-add-data-source-in-answers/</link>
      <pubDate>Fri, 27 May 2016 08:44:24 +0000</pubDate>
      <guid>https://rmoff.net/2016/05/27/obiee-12c-add-data-source-in-answers/</guid>
      <description>&lt;p&gt;So this had me scratching my head for a good hour today. Comparing SampleApp v511 against a vanilla OBIEE 12c install I&amp;rsquo;d done, one had &amp;ldquo;Add Data Source&amp;rdquo; as an option in Answers, the other didn&amp;rsquo;t. The strange thing was that the option &lt;em&gt;wasn&amp;rsquo;t&lt;/em&gt; there in SampleApp &amp;ndash; and usually that has all the bells and whistles enabled.&lt;/p&gt;&#xA;&lt;p&gt;After checking and re-checking the &lt;strong&gt;Manage Privileges&lt;/strong&gt; option, and even the Application Policy grants, and the manual, I hit MoS - and turned up &lt;a href=&#34;https://support.oracle.com/epmos/faces/DocContentDisplay?id=2093886.1&#34;&gt;Doc ID 2093886.1&lt;/a&gt;.&lt;/p&gt;</description>
    </item>
    <item>
      <title>York Fry Ups</title>
      <link>https://rmoff.net/2016/05/24/york-fry-ups/</link>
      <pubDate>Tue, 24 May 2016 21:45:09 +0000</pubDate>
      <guid>https://rmoff.net/2016/05/24/york-fry-ups/</guid>
      <description>&lt;p&gt;I had the pleasure of not one but two fry-ups in York, UK last weekend.&lt;/p&gt;&#xA;&lt;p&gt;The first was courtesy of &lt;a href=&#34;https://bills-website.co.uk/&#34;&gt;Bill&amp;rsquo;s Restaurant&lt;/a&gt;&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/2016/05/CjDJFSBWYAAt6W3-jpg-large.jpg&#34; alt=&#34;Fry Up at Bill&amp;rsquo;s Restaurant&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;Overall, pretty good, and I&amp;rsquo;ve had much worse. All the ingredients seemed decent. The black pudding was overcooked and almost biscuit-like, but that&amp;rsquo;s my only serious grumble. The bacon was cooked well. That the black pudding, beans, and the mashed/fried potato thing each carried an extra charge annoyed me. Particularly with a hangover, I just want to be able to order a full english, without playing &lt;a href=&#34;https://en.wikipedia.org/wiki/Mastermind_(board_game)&#34;&gt;Mastermind&lt;/a&gt; to work out what&amp;rsquo;s in or not.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Lifting the Lid on OBIEE 12c Web Services - Part 1</title>
      <link>https://rmoff.net/2016/05/24/lifting-the-lid-on-obiee-12c-web-services-part-1/</link>
      <pubDate>Tue, 24 May 2016 21:15:00 +0000</pubDate>
      <guid>https://rmoff.net/2016/05/24/lifting-the-lid-on-obiee-12c-web-services-part-1/</guid>
      <description>&lt;p&gt;Architecturally, OBIEE 12c is - on the surface - pretty similar to OBIEE 11g. Sure, we&amp;rsquo;ve lost &lt;a href=&#34;https://docs.oracle.com/middleware/1221/biee/BIESG/whatsnew.htm#CJAFBCJC&#34;&gt;OPMN in favour of Node Manager&lt;/a&gt;, but all the old favourites are there - WebLogic Servers, BI Server (nqsserver / OBIS), Presentation Services (sawserver / OBIPS), and so on.&lt;/p&gt;&#xA;&lt;p&gt;But, scratch beneath the surface, or have a gander at &lt;a href=&#34;http://www.ioug.org/p/cm/ld/fid=985&amp;amp;tid=743&amp;amp;sid=7207&#34;&gt;slide decks such as this one from BIWA this year&lt;/a&gt;, and you realise that change is afoot. Whilst the OBIEE core is still built around proprietary &amp;lsquo;black box&amp;rsquo; protocols (SAW from analytics to sawserver on port 9710, NQS ODBC from sawserver to nqsserver, cluster management on 9706 to nqsclustercontroller), there are now &lt;a href=&#34;https://en.wikipedia.org/wiki/Representational_state_transfer&#34;&gt;REST-based&lt;/a&gt; web services springing up (in addition to the &lt;a href=&#34;https://docs.oracle.com/middleware/1221/biee/BIEIT/soa_overview.htm#BABHJJAC&#34;&gt;existing SOAP&lt;/a&gt; services that have been there since at least 10g). Whilst the REST services are there under the covers, &lt;strong&gt;they are neither documented nor user-serviceable&lt;/strong&gt;, but they are there. But let me re-iterate:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Kibana Timelion - Series Calculations - Difference from One Week Ago</title>
      <link>https://rmoff.net/2016/05/23/kibana-timelion-series-calculations-difference-from-one-week-ago/</link>
      <pubDate>Mon, 23 May 2016 09:46:28 +0000</pubDate>
      <guid>https://rmoff.net/2016/05/23/kibana-timelion-series-calculations-difference-from-one-week-ago/</guid>
      <description>&lt;p&gt;I wrote recently about &lt;a href=&#34;https://rmoff.net/2016/03/29/experiments-with-kibana-timelion/&#34;&gt;Kibana&amp;rsquo;s excellent Timelion feature&lt;/a&gt;, which brings time-series visualisations to Kibana. In the comments Ben Huang asked:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;do you know how to show whats the difference between this Friday and last Friday by Timelion?&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;p&gt;So I thought I&amp;rsquo;d answer properly here.&lt;/p&gt;&#xA;&lt;p&gt;Timelion includes mathematical functions including &lt;code&gt;add&lt;/code&gt; and &lt;code&gt;subtract&lt;/code&gt;, as well as the ability to show data &lt;code&gt;offset&lt;/code&gt; by an amount of time. So to answer Ben&amp;rsquo;s query, we combine the two.&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBIEE 12c hangs at startup - Starting AdminServer ...</title>
      <link>https://rmoff.net/2016/05/20/obiee-12c-hangs-at-startup-starting-adminserver-.../</link>
      <pubDate>Fri, 20 May 2016 14:22:21 +0000</pubDate>
      <guid>https://rmoff.net/2016/05/20/obiee-12c-hangs-at-startup-starting-adminserver-.../</guid>
      <description>&lt;p&gt;Running the OBIEE 12c startup on Windows:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;C:\app\oracle\fmw\user_projects\domains\bi\bitools\bin\start.cmd&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;Just hangs at:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;Starting AdminServer ...&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;No CPU being consumed, very odd. But then &amp;hellip; looking at &lt;code&gt;DOMAIN_HOME\servers\AdminServer\logs\AdminServer.out&lt;/code&gt; shows the last log entry was:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;Enter username to boot WebLogic server:&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;And that&amp;rsquo;s bad news, cos that&amp;rsquo;s an interactive prompt, but not echo&amp;rsquo;d to the console output of the startup command, and there&amp;rsquo;s no way to interact with it.&lt;/p&gt;&#xA;&lt;p&gt;The &lt;code&gt;start.cmd&lt;/code&gt; was being called by adding it to the Startup folder (&lt;code&gt;C:\ProgramData\Microsoft\Windows\Start Menu\Programs\StartUp&lt;/code&gt;), and I guess it was something about this that stopped the prompt coming back to the console, because when I ran it manually from the command prompt, I got this:&lt;/p&gt;</description>
    </item>
    <item>
      <title>oracle.bi.bar.exceptions.UnSupportedBarException: The Bar file provided as input is not supported in this BI Platfrom release.</title>
      <link>https://rmoff.net/2016/05/19/oracle.bi.bar.exceptions.unsupportedbarexception-the-bar-file-provided-as-input-is-not-supported-in-this-bi-platfrom-release./</link>
      <pubDate>Thu, 19 May 2016 10:06:03 +0000</pubDate>
      <guid>https://rmoff.net/2016/05/19/oracle.bi.bar.exceptions.unsupportedbarexception-the-bar-file-provided-as-input-is-not-supported-in-this-bi-platfrom-release./</guid>
      <description>&lt;p&gt;Another quick note on OBIEE 12c, this time on the &lt;a href=&#34;https://docs.oracle.com/middleware/1221/biee/BIESG/configrepos.htm#BIESG9316&#34;&gt;importServiceInstance&lt;/a&gt; command. If you run it with a BAR file that doesn&amp;rsquo;t exist, it&amp;rsquo;ll fail (obviously), but the error at the end of the stack trace is slightly confusing:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;oracle.bi.bar.exceptions.UnSupportedBarException: &#xA;The Bar file provided as input is not supported in this BI Platfrom release.&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;Scrolling back up the stack trace does show the error message:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;SEVERE: Failed in reading bar file. [...]&#xA;java.io.FileNotFoundException: [...] &#xA;(The system cannot find the file specified)&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;So &amp;hellip; RTEM (Read the Fantastic Error Message) in full, don&amp;rsquo;t just skim to the end&amp;hellip;&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBIEE Baseline Validation Tool - Parameter &#39;directory&#39; is not a directory</title>
      <link>https://rmoff.net/2016/05/18/obiee-baseline-validation-tool-parameter-directory-is-not-a-directory/</link>
      <pubDate>Wed, 18 May 2016 15:35:46 +0000</pubDate>
      <guid>https://rmoff.net/2016/05/18/obiee-baseline-validation-tool-parameter-directory-is-not-a-directory/</guid>
      <description>&lt;p&gt;Interesting quirk in running Baseline Validation Tool for OBIEE here. If you invoke &lt;code&gt;obibvt&lt;/code&gt; from the &lt;code&gt;bin&lt;/code&gt; folder, it errors with &lt;strong&gt;Parameter &amp;lsquo;directory&amp;rsquo; is not a directory&lt;/strong&gt;&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;PS C:\OracleBI-BVT&amp;gt; cd bin&#xA;PS C:\OracleBI-BVT\bin&amp;gt; .\obibvt -config C:\OracleBI-BVT\bin\bvt-config.xml -deployment current&#xA; INFO: Result folder: Results\current&#xA;Throwable: Parameter &amp;#39;directory&amp;#39; is not a directory&#xA;Thread[main,5,main]&#xA;SEVERE: Unhandled Exception&#xA;SEVERE: java.lang.IllegalArgumentException: Parameter &amp;#39;directory&amp;#39; is not a directory&#xA;       at org.apache.commons.io.FileUtils.validateListFilesParameters(FileUtils.java:545)&#xA;       at org.apache.commons.io.FileUtils.listFiles(FileUtils.java:521)&#xA;       at org.apache.commons.io.FileUtils.listFiles(FileUtils.java:691)&#xA;       at com.oracle.biee.bvt.UpgradeTool.loadPlugins(UpgradeTool.java:537)&#xA;       at com.oracle.biee.bvt.UpgradeTool.runPluginTests(UpgradeTool.java:644)&#xA;       at com.oracle.biee.bvt.UpgradeTool.run(UpgradeTool.java:812)&#xA;       at com.oracle.biee.bvt.UpgradeTool.main(UpgradeTool.java:999)&#xA;&#xA;PS C:\OracleBI-BVT\bin&amp;gt;&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Solution? Run the exact same command, but from the folder above:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Monitoring Logstash Ingest Rates with Elasticsearch, Kibana, and Timelion</title>
      <link>https://rmoff.net/2016/05/13/monitoring-logstash-ingest-rates-with-elasticsearch-kibana-and-timelion/</link>
      <pubDate>Fri, 13 May 2016 05:45:19 +0000</pubDate>
      <guid>https://rmoff.net/2016/05/13/monitoring-logstash-ingest-rates-with-elasticsearch-kibana-and-timelion/</guid>
      <description>&lt;p&gt;Yesterday I wrote about &lt;a href=&#34;https://rmoff.net/2016/05/12/monitoring-logstash-ingest-rates-with-influxdb-and-grafana/&#34;&gt;Monitoring Logstash Ingest Rates with InfluxDB and Grafana&lt;/a&gt;, in which InfluxDB provided the data store for the ingest rate data, and Grafana the frontend.&lt;/p&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://twitter.com/warkolm/&#34;&gt;Mark Walkom&lt;/a&gt; reminded me on twitter that the next release of Logstash will add more functionality in this area - and that it&amp;rsquo;ll integrate back into the Elastic stack:&lt;/p&gt;&#xA;&lt;blockquote class=&#34;twitter-tweet&#34; data-lang=&#34;en&#34;&gt;&lt;p lang=&#34;en&#34; dir=&#34;ltr&#34;&gt;&lt;a href=&#34;https://twitter.com/rmoff&#34;&gt;@rmoff&lt;/a&gt; nice, LS 5.0 will have APIs exposing metrics too. they’ll be integrated back into Marvel/Monitoring! :)&lt;/p&gt;</description>
    </item>
    <item>
      <title>Monitoring Logstash Ingest Rates with InfluxDB and Grafana</title>
      <link>https://rmoff.net/2016/05/12/monitoring-logstash-ingest-rates-with-influxdb-and-grafana/</link>
      <pubDate>Thu, 12 May 2016 20:56:38 +0000</pubDate>
      <guid>https://rmoff.net/2016/05/12/monitoring-logstash-ingest-rates-with-influxdb-and-grafana/</guid>
      <description>&lt;p&gt;In this article I&amp;rsquo;m going to show you how to easily monitor the rate at which Logstash is ingesting data, as well as in future articles the rate at which Elasticsearch is indexing it. It&amp;rsquo;s a nice little touch to add to any project involving Logstash, and it&amp;rsquo;s easy to do.&lt;/p&gt;&#xA;&lt;p&gt;Logstash is a powerful tool for data ingest, processing, and distribution. It originated as simply a pipe to slurp up log files and put them into Elasticsearch, but has evolved into a whole bunch more. With connectors to JDBC and Kafka, as well as many other &lt;a href=&#34;https://www.elastic.co/guide/en/logstash/current/input-plugins.html&#34;&gt;input&lt;/a&gt; and &lt;a href=&#34;https://www.elastic.co/guide/en/logstash/current/output-plugins.html&#34;&gt;output&lt;/a&gt; options (not to mention the &lt;a href=&#34;https://www.elastic.co/guide/en/logstash/current/filter-plugins.html&#34;&gt;filtering&lt;/a&gt; possibilities), it really is a great bit of software to use. I&amp;rsquo;ve used it over the years with &lt;a href=&#34;http://www.rittmanmead.com/2014/10/monitoring-obiee-with-elasticsearch-logstash-and-kibana/&#34;&gt;OBIEE&lt;/a&gt;, as well as more recently to &lt;a href=&#34;https://www.elastic.co/blog/visualising-oracle-performance-data-with-the-elastic-stack&#34;&gt;pull data from Oracle&lt;/a&gt;, and even &lt;a href=&#34;https://rmoff.net/2016/03/24/my-latest-irc-client-kibana/&#34;&gt;IRC&lt;/a&gt;. Another great set of tools is &lt;a href=&#34;http://influxdb.com&#34;&gt;InfluxDB&lt;/a&gt; and &lt;a href=&#34;http://grafana.org&#34;&gt;Grafana&lt;/a&gt;, which for me really round off the standalone Elastic platform (previously known as ELK - Elasticsearch, Logstash, and Kibana). What InfluxDB and Grafana give is a powerful dedicated time series database and flexible time series-based dashboarding tool respectively. 
A topic for another day is the Elasticsearch vs InfluxDB overlap, and Kibana vs Grafana - but for now, just take it as read that it&amp;rsquo;s horses for courses, right tool for the right job, etc.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Collection of Articles on How to Write a Good Conference Abstract</title>
      <link>https://rmoff.net/2016/05/05/collection-of-articles-on-how-to-write-a-good-conference-abstract/</link>
      <pubDate>Thu, 05 May 2016 09:57:00 +0000</pubDate>
      <guid>https://rmoff.net/2016/05/05/collection-of-articles-on-how-to-write-a-good-conference-abstract/</guid>
      <description>&lt;p&gt;Here&amp;rsquo;s a collection of useful articles that I&amp;rsquo;ve found over the years that give good advice on writing a good abstract, mistakes to avoid, etc:&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://dataeducation.com/capturing-attention-writing-great-session-descriptions/&#34;&gt;Adam Machanic - Capturing Attention: Writing Great Session Descriptions&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://blog.pythian.com/concrete-advice-for-abstract-writers/&#34;&gt;Gwen Shapira - Concrete Advice for Abstract Writers&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://dbakevlar.com/2013/10/abstracts-reviews-and-conferences-oh-my/&#34;&gt;Kellyn Pot’Vin-Gorman - Abstracts, Reviews and Conferences, Oh My!&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://alistapart.com/article/conference-proposals-that-dont-suck&#34;&gt;Russ Unger - Conference Proposals that Don’t Suck&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://mwidlake.wordpress.com/2015/04/17/tips-on-submitting-an-abstract-to-conference/&#34;&gt;Martin Widlake - Tips on Submitting an Abstract to Conference&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://bridgetkromhout.com/blog/give-actionable-takeaways/&#34;&gt;Bridget Kromhout - give actionable takeaways&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;hr&gt;&#xA;&lt;p&gt;(post photo courtesy of &lt;a href=&#34;https://unsplash.com/@calum_mac&#34;&gt;Calum MacAulay&lt;/a&gt; on &lt;a href=&#34;https://unsplash.com&#34;&gt;https://unsplash.com&lt;/a&gt;)&lt;/p&gt;</description>
    </item>
    <item>
      <title>Using R to Denormalise Data for Analysis in Kibana</title>
      <link>https://rmoff.net/2016/04/24/using-r-to-denormalise-data-for-analysis-in-kibana/</link>
      <pubDate>Sun, 24 Apr 2016 12:22:12 +0000</pubDate>
      <guid>https://rmoff.net/2016/04/24/using-r-to-denormalise-data-for-analysis-in-kibana/</guid>
      <description>&lt;p&gt;&lt;a href=&#34;https://www.elastic.co/products/kibana&#34;&gt;Kibana&lt;/a&gt; is a tool from &lt;a href=&#34;https://www.elastic.co/&#34;&gt;Elastic&lt;/a&gt; that makes analysis of data held in &lt;a href=&#34;https://www.elastic.co/products/elasticsearch&#34;&gt;Elasticsearch&lt;/a&gt; really easy and very powerful. Because Elasticsearch has a very loose schema that can evolve on demand, it is very quick to get up and running with some cool visualisations and analysis on any set of data. I demonstrated this in a &lt;a href=&#34;http://www.rittmanmead.com/2015/04/using-the-elk-stack-to-analyse-donors-choose-data/&#34;&gt;blog post last year&lt;/a&gt;, taking a CSV file and loading it into Elasticsearch via Logstash.&lt;/p&gt;&#xA;&lt;p&gt;This is all great, but the one real sticking point with analytics in Elasticsearch/Kibana is that it needs the data to be &lt;strong&gt;denormalised&lt;/strong&gt;. That is, you can&amp;rsquo;t give it a bunch of sources of data and have it perform the joins for you in Kibana - it just doesn&amp;rsquo;t work like that. If you&amp;rsquo;re using Elasticsearch alone for analytics, maybe with a bespoke application, &lt;a href=&#34;https://www.elastic.co/guide/en/elasticsearch/guide/current/relations.html&#34;&gt;there are ways of approaching it&lt;/a&gt;, but not through Kibana. Now, depending on where the data is coming from, this may not be a problem. For example, if you use the &lt;a href=&#34;https://www.elastic.co/guide/en/logstash/current/plugins-inputs-jdbc.html&#34;&gt;JDBC Logstash input&lt;/a&gt; to pull from an RDBMS source you can specify a complex SQL query going across multiple tables, so that the data when it hits Elasticsearch is nice and denormalised and ready for fun in Kibana. But source data doesn&amp;rsquo;t always come this way, and it&amp;rsquo;s useful to have a way to work with the data when it arrives like this.&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBIEE security patches, and FINAL 11.1.1.7 patchset release</title>
      <link>https://rmoff.net/2016/04/18/obiee-security-patches-and-final-11.1.1.7-patchset-release/</link>
      <pubDate>Mon, 18 Apr 2016 15:36:00 +0000</pubDate>
      <guid>https://rmoff.net/2016/04/18/obiee-security-patches-and-final-11.1.1.7-patchset-release/</guid>
      <description>&lt;p&gt;Two vulns for OBIEE in the latest critical patch update (CPU): &lt;a href=&#34;http://www.oracle.com/technetwork/security-advisory/cpuapr2016v3-2985753.html?elq_mid=45463&amp;amp;sh=91225181314122121267715271910&amp;amp;cmid=WWMK10067711MPP001C140&#34;&gt;http://www.oracle.com/technetwork/security-advisory/cpuapr2016v3-2985753.html?elq_mid=45463&amp;amp;sh=91225181314122121267715271910&amp;amp;cmid=WWMK10067711MPP001C140&lt;/a&gt;&lt;/p&gt;&#xA;&lt;p&gt;The patches are in bundle patch &lt;code&gt;.160419&lt;/code&gt;:&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;12.2.1: &lt;a href=&#34;https://support.oracle.com/epmos/faces/ui/patch/PatchDetail.jspx?parent=DOCUMENT&amp;amp;sourceId=2102148.1&amp;amp;patchId=22734181&#34;&gt;https://support.oracle.com/epmos/faces/ui/patch/PatchDetail.jspx?parent=DOCUMENT&amp;amp;sourceId=2102148.1&amp;amp;patchId=22734181&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;11.1.1.9: &lt;a href=&#34;https://support.oracle.com/epmos/faces/ui/patch/PatchDetail.jspx?parent=DOCUMENT&amp;amp;sourceId=2102148.1&amp;amp;patchId=22393988&#34;&gt;https://support.oracle.com/epmos/faces/ui/patch/PatchDetail.jspx?parent=DOCUMENT&amp;amp;sourceId=2102148.1&amp;amp;patchId=22393988&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;11.1.1.7: &lt;a href=&#34;https://support.oracle.com/epmos/faces/ui/patch/PatchDetail.jspx?parent=DOCUMENT&amp;amp;sourceId=2102148.1&amp;amp;patchId=22225110&#34;&gt;https://support.oracle.com/epmos/faces/ui/patch/PatchDetail.jspx?parent=DOCUMENT&amp;amp;sourceId=2102148.1&amp;amp;patchId=22225110&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;p&gt;Note that April 2016 is the &lt;strong&gt;last regular patchset&lt;/strong&gt; for 11.1.1.7, ref: &lt;a href=&#34;https://support.oracle.com/epmos/faces/DocumentDisplay?id=2102148.1#mozTocId410847&#34;&gt;https://support.oracle.com/epmos/faces/DocumentDisplay?id=2102148.1#mozTocId410847&lt;/a&gt;. If you&amp;rsquo;re still on it, or earlier, time to upgrade!&lt;/p&gt;&#xA;&lt;hr&gt;&#xA;&lt;p&gt;(Photo credit: &lt;a href=&#34;https://unsplash.com/@jenlittlebirdie&#34;&gt;https://unsplash.com/@jenlittlebirdie&lt;/a&gt;)&lt;/p&gt;</description>
    </item>
    <item>
      <title>Streaming Data through Oracle GoldenGate to Elasticsearch</title>
      <link>https://rmoff.net/2016/04/14/streaming-data-through-oracle-goldengate-to-elasticsearch/</link>
      <pubDate>Thu, 14 Apr 2016 22:51:43 +0000</pubDate>
      <guid>https://rmoff.net/2016/04/14/streaming-data-through-oracle-goldengate-to-elasticsearch/</guid>
      <description>&lt;p&gt;Recently added to the &lt;a href=&#34;https://java.net/projects/oracledi/&#34;&gt;oracledi project over at java.net&lt;/a&gt; is &lt;a href=&#34;https://java.net/projects/oracledi/&#34;&gt;an adaptor&lt;/a&gt; enabling Oracle GoldenGate (OGG) to send data to Elasticsearch. This adds a powerful alternative to [micro-]batch extract via JDBC from Oracle to Elasticsearch, which I wrote about recently &lt;a href=&#34;https://www.elastic.co/blog/visualising-oracle-performance-data-with-the-elastic-stack&#34;&gt;over at the Elastic blog&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;p&gt;Elasticsearch is a &amp;lsquo;document store&amp;rsquo; widely used for both search and analytics. It&amp;rsquo;s something I&amp;rsquo;ve written a lot about (&lt;a href=&#34;https://rmoff.net/categories/elasticsearch/&#34;&gt;here&lt;/a&gt; and &lt;a href=&#34;http://www.rittmanmead.com/tag/elasticsearch&#34;&gt;here&lt;/a&gt; for archives), as well as &lt;a href=&#34;https://talks.rmoff.net/data-discovery-and-systems-diagnostics-with-the-elk-stack/&#34;&gt;spoken about&lt;/a&gt; - preaching the good word, as it were, since the Elastic stack as a whole is very very good at what it does and a pleasure to work with. So, being able to combine that with my &amp;ldquo;day job&amp;rdquo; focus of Oracle is fun. Let&amp;rsquo;s get started!&lt;/p&gt;</description>
    </item>
    <item>
      <title>Decoupling the Data Pipeline with Kafka - A (Very) Simple Real Life Example</title>
      <link>https://rmoff.net/2016/04/12/decoupling-the-data-pipeline-with-kafka-a-very-simple-real-life-example/</link>
      <pubDate>Tue, 12 Apr 2016 21:50:46 +0000</pubDate>
      <guid>https://rmoff.net/2016/04/12/decoupling-the-data-pipeline-with-kafka-a-very-simple-real-life-example/</guid>
      <description>&lt;p&gt;I&amp;rsquo;ve recently been playing around with the ELK stack (&lt;a href=&#34;https://www.elastic.co/blog/heya-elastic-stack-and-x-pack&#34;&gt;now officially known as the Elastic stack&lt;/a&gt;) collecting data from &lt;a href=&#34;https://rmoff.net/2016/03/03/obihackers-irc-channel/&#34;&gt;an IRC channel&lt;/a&gt; with Elastic&amp;rsquo;s Logstash, storing it in Elasticsearch and &lt;a href=&#34;https://rmoff.net/2016/03/24/my-latest-irc-client-kibana/&#34;&gt;analysing it with Kibana&lt;/a&gt;. But, this isn&amp;rsquo;t an &amp;ldquo;ELK&amp;rdquo; post - this is a Kafka post! ELK is just some example data manipulation tooling that helps demonstrate the principles.&lt;/p&gt;&#xA;&lt;p&gt;As I &lt;a href=&#34;http://www.rittmanmead.com/2015/10/forays-into-kafka-enabling-flexible-data-pipelines/&#34;&gt;wrote about last year&lt;/a&gt;, Apache Kafka provides a handy way to build flexible &amp;ldquo;pipelines&amp;rdquo;. Today I&amp;rsquo;m writing up a short real-world example of this in practice. There are three elements to the flexibility that I really want to highlight:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Food Pr0n 02 - Devon &amp; Dorset</title>
      <link>https://rmoff.net/2016/04/11/food-pr0n-02-devon-dorset/</link>
      <pubDate>Mon, 11 Apr 2016 19:00:56 +0000</pubDate>
      <guid>https://rmoff.net/2016/04/11/food-pr0n-02-devon-dorset/</guid>
      <description>&lt;p&gt;On a family holiday in South Devon last week I had some good food experiences. Top of the pile was a &lt;strong&gt;&lt;a href=&#34;https://en.m.wikipedia.org/wiki/Lardy_cake&#34;&gt;Lardy Cake&lt;/a&gt;&lt;/strong&gt;, a delicacy new to me but which I&amp;rsquo;ll be sure to be searching out again. It reminded me of an Eccles cake, but bigger and lardier!&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/2016/04/thumb_IMG_6953_1024.jpg&#34; alt=&#34;Lardy Cake&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;I got mine from &lt;a href=&#34;https://maps.google.com/?q=Vinnicombes%20Ltd%2C%2060%20High%20St%2C%20Sidmouth%2C%20Devon%20EX10%208EH&amp;amp;ftid=0x486d9ce356de16b5:0xd613601c4557c408&amp;amp;hl=en-GB&amp;amp;gl=uk&#34;&gt;Vinnicombes&lt;/a&gt; on the High Street in Sidmouth.&lt;/p&gt;&#xA;&lt;p&gt;Just down the coast from Sidmouth is a village called &lt;a href=&#34;https://maps.google.com/?q=Beer%2C%20Devon&amp;amp;ftid=0x486d8141cbe68345:0xc5a6d16d3c454ca3&amp;amp;hl=en-GB&amp;amp;gl=uk&#34;&gt;Beer&lt;/a&gt;. I didn&amp;rsquo;t have any food of note there, but it&amp;rsquo;s called Beer which is certainly worth recording :)&lt;/p&gt;</description>
    </item>
    <item>
      <title>Experiments with Kibana Timelion</title>
      <link>https://rmoff.net/2016/03/29/experiments-with-kibana-timelion/</link>
      <pubDate>Tue, 29 Mar 2016 21:07:00 +0000</pubDate>
      <guid>https://rmoff.net/2016/03/29/experiments-with-kibana-timelion/</guid>
      <description>&lt;p&gt;&lt;a href=&#34;https://www.elastic.co/blog/timelion-timeline&#34;&gt;Timelion&lt;/a&gt; was released in November 2015 and with the 4.4.2 release of &lt;a href=&#34;https://www.elastic.co/products/kibana&#34;&gt;Kibana&lt;/a&gt; is available as a native visualisation once installed. It adds some powerful capabilities to Kibana as a timeseries analysis tool, using its own data manipulation language.&lt;/p&gt;&#xA;&lt;p&gt;Installing Timelion is a piece of cake:&lt;/p&gt;&#xA;&lt;pre&gt;&lt;code&gt;./bin/kibana plugin -i kibana/timelion&#xA;&lt;/code&gt;&lt;/pre&gt;&#xA;&lt;p&gt;After restarting Kibana, you&amp;rsquo;ll see it as an option from the application picker&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/2016/03/2016-03-29_23-13-49.png&#34; alt=&#34;&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;There&amp;rsquo;s a bit of a learning curve with Timelion, but it&amp;rsquo;s worth it. &lt;a href=&#34;https://www.elastic.co/blog/timelion-timeline&#34;&gt;The blog&lt;/a&gt; gives some basics, and the built-in help is really good too:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Connecting to OBIEE via JDBC - with jisql</title>
      <link>https://rmoff.net/2016/03/28/connecting-to-obiee-via-jdbc-with-jisql/</link>
      <pubDate>Mon, 28 Mar 2016 21:01:00 +0000</pubDate>
      <guid>https://rmoff.net/2016/03/28/connecting-to-obiee-via-jdbc-with-jisql/</guid>
      <description>&lt;p&gt;OBIEE supports JDBC as a connection protocol, using the driver available on all installations of OBIEE, &lt;a href=&#34;https://docs.oracle.com/middleware/11119/biee/BIEIT/odbc_data_source.htm#BIEIT1738&#34;&gt;bijdbc.jar&lt;/a&gt;. This makes connecting to OBIEE from custom or third-party applications very easy. Once connected, you issue &amp;ldquo;Logical SQL&amp;rdquo; against the &amp;ldquo;tables&amp;rdquo; of the Presentation Layer. An example of logical SQL is:&lt;/p&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-sql&#34; data-lang=&#34;sql&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#008000;font-weight:bold&#34;&gt;SELECT&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color:#ba2121&#34;&gt;&amp;#34;Time&amp;#34;&lt;/span&gt;.&lt;span style=&#34;color:#ba2121&#34;&gt;&amp;#34;T05 Per Name Year&amp;#34;&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;saw_0&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color:#008000;font-weight:bold&#34;&gt;FROM&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color:#ba2121&#34;&gt;&amp;#34;A - Sample Sales&amp;#34;&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;To find more Logical SQL simply inspect your nqquery.log (obis-query.log in 12c), or Usage Tracking.&lt;/p&gt;</description>
    </item>
    <item>
      <title>My latest IRC client : Kibana</title>
      <link>https://rmoff.net/2016/03/24/my-latest-irc-client-kibana/</link>
      <pubDate>Thu, 24 Mar 2016 21:38:02 +0000</pubDate>
      <guid>https://rmoff.net/2016/03/24/my-latest-irc-client-kibana/</guid>
      <description>&lt;p&gt;OK, maybe that&amp;rsquo;s not entirely true. But my &lt;em&gt;read-only&lt;/em&gt; client, certainly.&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/2016/03/2016-03-24_21-15-30.png&#34; alt=&#34;&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;I was perusing the &lt;a href=&#34;https://www.elastic.co/guide/en/logstash/current/input-plugins.html&#34;&gt;Logstash input plugins&lt;/a&gt; recently when I noticed that there was one for &lt;a href=&#34;https://www.elastic.co/guide/en/logstash/current/plugins-inputs-irc.html&#34;&gt;IRC&lt;/a&gt;. Being a fan of IRC and a regular on the &lt;a href=&#34;https://rmoff.net/2016/03/03/obihackers-irc-channel/&#34;&gt;#obihackers&lt;/a&gt; channel, I thought this could be fun and yet another great example of how easy &lt;a href=&#34;http://elastic.co&#34;&gt;the Elastic stack&lt;/a&gt; is to work with.&lt;/p&gt;&#xA;&lt;p&gt;Installation is a piece of cake:&lt;/p&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-shell&#34; data-lang=&#34;shell&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;wget https://download.elasticsearch.org/elasticsearch/release/org/elasticsearch/distribution/zip/elasticsearch/2.2.1/elasticsearch-2.2.1.zip&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;wget https://download.elastic.co/logstash/logstash/logstash-2.2.2.zip&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;wget https://download.elastic.co/kibana/kibana/kibana-4.4.2-linux-x64.tar.gz&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;unzip &lt;span style=&#34;color:#b62;font-weight:bold&#34;&gt;\*&lt;/span&gt;.zip&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;tar -xf kibana-4.4.2-linux-x64.tar.gz&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;sudo mv elasticsearch-2.2.1 logstash-2.2.2 kibana-4.4.2-linux-x64 /opt&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;(you&amp;rsquo;ll also need Oracle JDK installed if not already, &lt;a href=&#34;http://www.jamescoyle.net/how-to/1897-download-oracle-java-from-the-terminal-with-wget&#34;&gt;here&amp;rsquo;s a handy way to get it from the CLI&lt;/a&gt;).&lt;/p&gt;</description>
    </item>
    <item>
      <title>Food Pr0n - 01</title>
      <link>https://rmoff.net/2016/03/19/food-pr0n-01/</link>
      <pubDate>Sat, 19 Mar 2016 21:18:00 +0000</pubDate>
      <guid>https://rmoff.net/2016/03/19/food-pr0n-01/</guid>
      <description>&lt;p&gt;One of the perks of my job is that I get to travel to some pretty nice places (hi, San Francisco, Bergen, Åland Islands), and get to eat some pretty good food too. If you&amp;rsquo;re looking for some techie content, then move along and go and read about &lt;a href=&#34;https://rmoff.net/2016/03/16/fun-and-games-with-oracle-goldengate-kafka-and-logstash-on-bigdatalite-4.4/&#34;&gt;Kafka&lt;/a&gt;, but if you enjoy food pr0n then stay put.&lt;/p&gt;&#xA;&lt;p&gt;I was working for a client in the centre of Manchester this week, staying as usual at a Premier Inn. I&amp;rsquo;m a big fan of my &lt;a href=&#34;https://rmoff.net/2016/02/26/what-makes-a-good-full-english/&#34;&gt;fried breakfasts&lt;/a&gt;, and whilst never anything to write home about (or indeed write a blog about), Premier Inn breakfasts are generally fine for what they are &amp;ndash; but this one was a real disappointment. Even with the excitement of seeing &lt;a href=&#34;https://en.wikipedia.org/wiki/Bubble_and_squeak&#34;&gt;bubble and squeak&lt;/a&gt;, it was a real let-down; lukewarm, and the sausages in particular tasted really artificially flavoured.&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBIEE 11.1.1.9 installation - JPS-06514: Opening of file based keystore failed</title>
      <link>https://rmoff.net/2016/03/18/obiee-11.1.1.9-installation-jps-06514-opening-of-file-based-keystore-failed/</link>
      <pubDate>Fri, 18 Mar 2016 18:04:07 +0000</pubDate>
      <guid>https://rmoff.net/2016/03/18/obiee-11.1.1.9-installation-jps-06514-opening-of-file-based-keystore-failed/</guid>
      <description>&lt;p&gt;I got this lovely failure &lt;strong&gt;during a fresh install&lt;/strong&gt; of OBIEE 11.1.1.9. I emphasise that it was during the install because there are other causes for this error &lt;strong&gt;on an existing system&lt;/strong&gt;, to do with corrupted credential stores etc &amp;ndash; not the case here.&lt;/p&gt;&#xA;&lt;p&gt;The install had copied in the binaries and was in the process of building the domain. During the early stages of this, where it starts configuring and restarting the AdminServer, it failed, with the AdminServer.log showing the following (I&amp;rsquo;ve extracted the salient errors from the log):&lt;/p&gt;</description>
    </item>
    <item>
      <title>Fun and Games with Oracle GoldenGate, Kafka, and Logstash on BigDataLite 4.4</title>
      <link>https://rmoff.net/2016/03/16/fun-and-games-with-oracle-goldengate-kafka-and-logstash-on-bigdatalite-4.4/</link>
      <pubDate>Wed, 16 Mar 2016 22:01:00 +0000</pubDate>
      <guid>https://rmoff.net/2016/03/16/fun-and-games-with-oracle-goldengate-kafka-and-logstash-on-bigdatalite-4.4/</guid>
      <description>&lt;p&gt;The Oracle by Example (ObE) &lt;a href=&#34;http://www.oracle.com/webfolder/technetwork/tutorials/obe/fmw/odi/odi_12c/DI_BDL_Guide/BigDataIntegration_Demo.html?cid=10235&amp;amp;ssid=0&#34;&gt;here&lt;/a&gt; demonstrating how to use &lt;a href=&#34;https://docs.oracle.com/goldengate/bd1221/gg-bd/GBDIN/intro_adapter.htm#GBDIN101&#34;&gt;GoldenGate to replicate transactions to big data targets&lt;/a&gt; such as HDFS is written for BigDataLite &lt;a href=&#34;http://www.oracle.com/technetwork/database/bigdata-appliance/oracle-bigdatalite421-2843803.html&#34;&gt;4.2.1&lt;/a&gt;, and for me didn&amp;rsquo;t work on the current latest version, &lt;a href=&#34;http://www.oracle.com/technetwork/database/bigdata-appliance/oracle-bigdatalite-2104726.html&#34;&gt;4.4.0&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;p&gt;The OBE (and similar &lt;a href=&#34;http://www.oracle.com/webfolder/technetwork/odi/ODI_BigData_HOL.pdf&#34;&gt;Hands On Lab&lt;/a&gt; PDF) assumes the presence of &lt;code&gt;pmov.prm&lt;/code&gt; and &lt;code&gt;pmov.properties&lt;/code&gt; in &lt;code&gt;/u01/ogg/dirprm/&lt;/code&gt;. On BDL 4.4 there&amp;rsquo;s only the extract from Oracle configuration, &lt;code&gt;emov&lt;/code&gt;.&lt;/p&gt;&#xA;&lt;p&gt;Fortunately it&amp;rsquo;s still possible to run this setup out of the box in BDL 4.4, with bells on, because it includes &lt;a href=&#34;http://kafka.apache.org/&#34;&gt;Kafka&lt;/a&gt; too. And, who doesn&amp;rsquo;t like a bit of Kafka nowadays?&lt;/p&gt;</description>
    </item>
    <item>
      <title>Presentation Slides… bye-bye Slideshare, hello Speakerdeck</title>
      <link>https://rmoff.net/2016/03/09/presentation-slides-bye-bye-slideshare-hello-speakerdeck/</link>
      <pubDate>Wed, 09 Mar 2016 09:43:30 +0000</pubDate>
      <guid>https://rmoff.net/2016/03/09/presentation-slides-bye-bye-slideshare-hello-speakerdeck/</guid>
      <description>&lt;p&gt;I&amp;rsquo;ve always defaulted to &lt;a href=&#34;https://talks.rmoff.net/&#34;&gt;Slideshare&lt;/a&gt; for hosting slides from presentations that I&amp;rsquo;ve given, but it&amp;rsquo;s become more and more crap-infested. The UI is messy, and the UX sucks - for example, if I want to download a slide deck, I most definitely 100% am not interested in &amp;ldquo;clipping&amp;rdquo; it&amp;hellip;even if you ask me every. damn. time:&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/2016/03/2016-03-09_09-32-29-1.png&#34; alt=&#34;Nope!&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;Looking around it seems the other popular option is &lt;a href=&#34;http://speakerdeck.com&#34;&gt;Speakerdeck&lt;/a&gt;. The UI is clean and simple, and as both a user and uploader I feel like I&amp;rsquo;m there to read and share slides rather than be monetised as an eyeball on the site.&lt;/p&gt;</description>
    </item>
    <item>
      <title>obihackers IRC channel</title>
      <link>https://rmoff.net/2016/03/03/obihackers-irc-channel/</link>
      <pubDate>Thu, 03 Mar 2016 22:55:37 +0000</pubDate>
      <guid>https://rmoff.net/2016/03/03/obihackers-irc-channel/</guid>
      <description>&lt;h3 id=&#34;obihackers&#34;&gt;&lt;code&gt;#obihackers&lt;/code&gt;&lt;/h3&gt;&#xA;&lt;p&gt;There&amp;rsquo;s a &lt;code&gt;#obihackers&lt;/code&gt; IRC channel on freenode, where a dozen or so of us have hung out for several years now. Chat is usually OBIEE, Oracle, ODI, and general geek out.&lt;/p&gt;&#xA;&lt;p&gt;Bear in mind this is the equivalent of us hanging out in a bar; if you wanna shoot the shit with a geeky question about OBIEE go ahead, but if you&amp;rsquo;ve come to get help with your homework without even buying a round, you&amp;rsquo;ll probably get short shrift&amp;hellip; ;-)&lt;/p&gt;</description>
    </item>
    <item>
      <title>Streaming data to InfluxDB from any bash command</title>
      <link>https://rmoff.net/2016/02/27/streaming-data-to-influxdb-from-any-bash-command/</link>
      <pubDate>Sat, 27 Feb 2016 21:05:00 +0000</pubDate>
      <guid>https://rmoff.net/2016/02/27/streaming-data-to-influxdb-from-any-bash-command/</guid>
      <description>&lt;p&gt;&lt;a href=&#34;https://influxdata.com/time-series-platform/influxdb/&#34;&gt;InfluxDB&lt;/a&gt; is a great time series database, that&amp;rsquo;s recently been rebranded as part of the &amp;ldquo;&lt;a href=&#34;https://influxdata.com/&#34;&gt;TICK&lt;/a&gt;&amp;rdquo; stack, including data collectors, visualisation, and ETL/Alerting. I&amp;rsquo;ve yet to really look at the other components, but InfluxDB alone works just great with my favourite visualisation/analysis tool for time series metrics, &lt;a href=&#34;http://grafana.org/&#34;&gt;Grafana&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;p&gt;Getting data into InfluxDB is easy, with many tools supporting the native InfluxDB &lt;a href=&#34;https://docs.influxdata.com/influxdb/v0.10/guides/writing_data/&#34;&gt;line input protocol&lt;/a&gt;, and those that don&amp;rsquo;t often supporting the &lt;a href=&#34;https://docs.influxdata.com/influxdb/v0.10/write_protocols/graphite/&#34;&gt;carbon protocol&lt;/a&gt; (from Graphite), which InfluxDB also supports (&lt;a href=&#34;https://docs.influxdata.com/influxdb/v0.10/write_protocols/&#34;&gt;along with others&lt;/a&gt;). So for collecting broad ranges of OS stats, for example, &lt;a href=&#34;http://collectl.sourceforge.net/&#34;&gt;collectl&lt;/a&gt; via carbon and nmon via &lt;a href=&#34;https://github.com/adejoux/nmon2influxdb&#34;&gt;nmon2influxdb&lt;/a&gt; are both viable options.&lt;/p&gt;</description>
    </item>
    <item>
      <title>What makes a good Full English?</title>
      <link>https://rmoff.net/2016/02/26/what-makes-a-good-full-english/</link>
      <pubDate>Fri, 26 Feb 2016 18:02:31 +0000</pubDate>
      <guid>https://rmoff.net/2016/02/26/what-makes-a-good-full-english/</guid>
      <description>&lt;p&gt;Thanks to the power of twitter, I can look back on all the many and varied Full English breakfasts that I&amp;rsquo;ve (mostly) enjoyed:&lt;/p&gt;&#xA;&lt;p&gt;👉🏻 &lt;a href=&#34;https://twitter.com/search?q=rmoff%20%23fullenglish&amp;amp;src=typd&#34;&gt;https://twitter.com/search?q=rmoff%20%23fullenglish&amp;amp;src=typd&lt;/a&gt;&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/2016/02/fullenglish.jpg&#34; alt=&#34;&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;What makes a good Full English?&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;Good ingredients, cooked well. Nothing worse than a limp pink sausage, as it were.&lt;/li&gt;&#xA;&lt;li&gt;Sausage, standard pork or Cumberland at most. Definitely no daft apricot and guava bean with a hint of foie gras nonsense. Must be cooked right well, crispy skin, almost burnt.&lt;/li&gt;&#xA;&lt;li&gt;Bacon, starting to crisp (but not deep fried and crisp-like as the Americans do it - it should bend if you pick it up)&lt;/li&gt;&#xA;&lt;li&gt;Black Pudding, seriously. Grim idea when you think about it (blood sausage) but so very tasty, and binds the other components together&lt;/li&gt;&#xA;&lt;li&gt;Fried eggs with runny yolks&lt;/li&gt;&#xA;&lt;li&gt;Scrambled eggs are making an appearance more recently too, after 30 years of not really caring for them. The dodgy &amp;ldquo;scrambled egg&amp;rdquo; that you get in some hotels I think straight from a packet mix are crap though and best avoided.&lt;/li&gt;&#xA;&lt;li&gt;Hash browns. Controversial, and often a bit greasy - but I&amp;rsquo;d never pass one by&lt;/li&gt;&#xA;&lt;li&gt;Fried slice - sadly all too often missing. 
I&amp;rsquo;ll forego hash browns if there&amp;rsquo;s a fried slice&lt;/li&gt;&#xA;&lt;li&gt;Tomatoes, but only if there&amp;rsquo;s more tomato than stalk as often happens&lt;/li&gt;&#xA;&lt;li&gt;Mushrooms, thickly chopped so there&amp;rsquo;s something to it - and certainly not the wussy sliced things that you get sitting in a pool of black greasy water at some fryup buffets&lt;/li&gt;&#xA;&lt;li&gt;Baked beans&lt;/li&gt;&#xA;&lt;li&gt;Good white toast&lt;/li&gt;&#xA;&lt;li&gt;&lt;strong&gt;HP Sauce&lt;/strong&gt; - although I&amp;rsquo;ve realised recently that a really good fryup doesn&amp;rsquo;t actually need any sauce. But if it&amp;rsquo;s subpar and/or cheap ingredients &amp;ndash; gotta have HP sauce (and definitely not that awful &amp;ldquo;brown sauce&amp;rdquo; that you get in sachets and tastes grim).&lt;/li&gt;&#xA;&lt;/ul&gt;</description>
    </item>
    <item>
      <title>Visualising OBIEE DMS Metrics over the years</title>
      <link>https://rmoff.net/2016/02/26/visualising-obiee-dms-metrics-over-the-years/</link>
      <pubDate>Fri, 26 Feb 2016 17:54:54 +0000</pubDate>
      <guid>https://rmoff.net/2016/02/26/visualising-obiee-dms-metrics-over-the-years/</guid>
      <description>&lt;p&gt;It struck me today when I was writing my most recent blog over at &lt;a href=&#34;http://ritt.md/obi-dms&#34;&gt;Rittman Mead&lt;/a&gt; that I&amp;rsquo;ve been playing with visualising OBIEE metrics for &lt;em&gt;years&lt;/em&gt; now.&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;Back in 2009 I wrote about using something called JManage to pull metrics out of OBIEE 10g via JMX:&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/rnm1978/jmanage08.png?w=900&amp;amp;h=760&#34; alt=&#34;&#34;&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;Still with OBIEE 10g in 2011, I was using rrdtool and some &lt;a href=&#34;https://rmoff.net/2010/12/06/collecting-obiee-systems-management-data-with-jmx/&#34;&gt;horrible-looking tcl hacking&lt;/a&gt; to get the metrics out through JMX:&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/rnm1978/graph.png?w=2048&amp;amp;h=542&#34; alt=&#34;&#34;&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;2014 brought with it DMS and my first forays with Graphite for storing &amp;amp; visualising data:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Blogging</title>
      <link>https://rmoff.net/2011/11/28/blogging/</link>
      <pubDate>Mon, 28 Nov 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/11/28/blogging/</guid>
      <description>&lt;p&gt;I will now be blogging mostly over at the venerable blog of my employer, &lt;a href=&#34;http://www.rittmanmead.com&#34;&gt;Rittman Mead&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;p&gt;You can see my first posting here: &lt;a href=&#34;http://www.rittmanmead.com/2011/11/web-services-in-bi-publisher-11g/&#34;&gt;Web Services in BI Publisher 11g&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;p&gt;Don&amp;rsquo;t entirely exclude this rnm1978 blog from your feeds, as I may still post more esoteric and random tidbits here.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Instrumenting OBIEE - the final chapter</title>
      <link>https://rmoff.net/2011/10/10/instrumenting-obiee-the-final-chapter/</link>
      <pubDate>Mon, 10 Oct 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/10/10/instrumenting-obiee-the-final-chapter/</guid>
      <description>&lt;p&gt; &lt;/p&gt;&#xA;&lt;hr&gt;&#xA;&lt;p&gt; &lt;/p&gt;&#xA;&lt;h3 id=&#34;this-article-has-been-superseded-by-a-newer-version&#34;&gt;&lt;em&gt;&lt;strong&gt;This article has been superseded by a newer version: &lt;a href=&#34;http://www.rittmanmead.com/2015/03/instrumenting-obiee-database-connections-for-improved-performance-diagnostics/&#34;&gt;Instrumenting OBIEE Database Connections For Improved Performance Diagnostics&lt;/a&gt;&lt;/strong&gt;&lt;/em&gt;&lt;/h3&gt;&#xA;&lt;hr&gt;&#xA;&lt;p&gt; &lt;/p&gt;&#xA;&lt;p&gt;(&lt;em&gt;Previously on this blog: &lt;a href=&#34;https://rmoff.net/2010/01/26/identify-your-obiee-users-by-setting-client-id-in-oracle-connection/&#34;&gt;1&lt;/a&gt;, &lt;a href=&#34;https://rmoff.net/2011/02/02/instrumenting-obiee-for-tracing-oracle-db-calls/&#34;&gt;2&lt;/a&gt;, &lt;a href=&#34;https://rmoff.net/2011/08/08/have-you-defined-client_id-in-obiee-yet/&#34;&gt;3&lt;/a&gt;&amp;hellip;&lt;/em&gt;)&lt;/p&gt;&#xA;&lt;h2 id=&#34;summary&#34;&gt;Summary&lt;/h2&gt;&#xA;&lt;p&gt;Instrument your code. Stop guessing. Make your DBA happy. Make your life as a BI Admin easier.&lt;/p&gt;&#xA;&lt;h2 id=&#34;the-problem&#34;&gt;The Problem&lt;/h2&gt;&#xA;&lt;p&gt;OBIEE will typically connect to the database using a generic application account. (Hopefully, you&amp;rsquo;ll have isolated it to an account used only for this purpose - if you haven&amp;rsquo;t, you should.)&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBI 11g : UPGAST-00055: error reading the Oracle Universal Installer inventory</title>
      <link>https://rmoff.net/2011/10/05/obi-11g-upgast-00055-error-reading-the-oracle-universal-installer-inventory/</link>
      <pubDate>Wed, 05 Oct 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/10/05/obi-11g-upgast-00055-error-reading-the-oracle-universal-installer-inventory/</guid>
      <description>&lt;p&gt;It&amp;rsquo;s not my fault really.&lt;/p&gt;&#xA;&lt;p&gt;When running an installation, presented with the option of&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;(a) do a bunch of stuff and wait to continue the install later or&lt;/li&gt;&#xA;&lt;li&gt;(b) tick a box and continue now&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;p&gt;it&amp;rsquo;s a better man than I who would opt for option (a).&lt;/p&gt;&#xA;&lt;p&gt;When I recently installed OBIEE 11g, I was prompted to get a script run as root to set up the inventory, or tick &amp;ldquo;Continue Installation with local inventory&amp;rdquo; to continue with the install.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Oracle - tnsping - Message 3513 not found;  product=NETWORK; facility=TNS</title>
      <link>https://rmoff.net/2011/09/26/oracle-tnsping-message-3513-not-found-productnetwork-facilitytns/</link>
      <pubDate>Mon, 26 Sep 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/09/26/oracle-tnsping-message-3513-not-found-productnetwork-facilitytns/</guid>
      <description>&lt;p&gt;Short note to record this, as Google drew no hits on it.&lt;/p&gt;&#xA;&lt;p&gt;Windows XP machine with existing Oracle 11.1 client installation, all working fine.&lt;/p&gt;&#xA;&lt;p&gt;Installed Oracle 11.2 XE, and started getting these errors:&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;C:\Windows\System32&amp;gt;tnsping DBNAME&#xD;&#xA;&#xD;&#xA;TNS Ping Utility for 32-bit Windows: Version 11.2.0.2.0 - Production on 26-SEP-2011 11:01:11&#xD;&#xA;&#xD;&#xA;Copyright (c) 1997, 2010, Oracle.  All rights reserved.&#xD;&#xA;&#xD;&#xA;Used parameter files:&#xD;&#xA;C:\app\userid\product\11.1.0\client_1\network\admin\sqlnet.ora&#xD;&#xA;&#xD;&#xA;&#xD;&#xA;Used TNSNAMES adapter to resolve the alias&#xD;&#xA;Message 3513 not found;  product=NETWORK; facility=TNS&#xD;&#xA;OK (20 msec)&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Also got these errors from a previously-functioning ODBC query in Excel when I tried to refresh it:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Sourcecode markup tweaks in Wordpress</title>
      <link>https://rmoff.net/2011/09/26/sourcecode-markup-tweaks-in-wordpress/</link>
      <pubDate>Mon, 26 Sep 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/09/26/sourcecode-markup-tweaks-in-wordpress/</guid>
      <description>&lt;p&gt;I noticed in &lt;a href=&#34;http://edstevensdba.wordpress.com/2011/02/16/sqlnet_client_cfg/&#34;&gt;Ed Stevens&amp;rsquo;&lt;/a&gt; blog posting that some sourcecode he&amp;rsquo;d posted had certain lines highlighted.&lt;/p&gt;&#xA;&lt;p&gt;Wordpress provides the sourcecode tag for marking up sourcecode in blog posts. For example:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;cd /some/random/folder ls -l # do not run this next line!&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;p&gt;is much better presented as:&lt;/p&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#008000&#34;&gt;cd&lt;/span&gt; /some/random/folder ls -l &lt;span style=&#34;color:#408080;font-style:italic&#34;&gt;# do not run this next line!&lt;/span&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;by wrapping it in &lt;code&gt;[sourcecode]&lt;/code&gt; tags.&lt;/p&gt;&#xA;&lt;p&gt;I&amp;rsquo;ve known about the language=&amp;#39;xx&amp;#39; attribute that you can use with the tag, but Ed&amp;rsquo;s posting prompted me to check on the syntax, and it turns out there are a few tweaks one can use. Some of them are illustrated below. The list is taken from Wordpress&amp;rsquo; &lt;a href=&#34;http://en.support.wordpress.com/code/posting-source-code/&#34;&gt;Posting Source Code&lt;/a&gt; reference page.&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;#REF: http://en.support.wordpress.com/code/posting-source-code/ # # This is some dummy source code to illustrate sourcecode posting on wordpress # Line 2 # Line 3 cd /some/random/folder ls -l # do not run this next line! rm -rf /some&#xA;&lt;/code&gt;&lt;/pre&gt;</description>
    </item>
    <item>
      <title>Friday miscellany</title>
      <link>https://rmoff.net/2011/09/16/friday-miscellany/</link>
      <pubDate>Fri, 16 Sep 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/09/16/friday-miscellany/</guid>
      <description>&lt;ul&gt;&#xA;&lt;li&gt;&lt;strong&gt;If you only read one blog post this month&lt;/strong&gt;, read &lt;a href=&#34;http://jamesmorle.wordpress.com/2011/09/16/right-practice/&#34;&gt;James Morle&amp;rsquo;s eloquent attack on the term &amp;ldquo;Best Practice&amp;rdquo;&lt;/a&gt;.&lt;/li&gt;&#xA;&lt;li&gt;I&amp;rsquo;m very excited to be joining &lt;a href=&#34;http://www.rittmanmead.com&#34;&gt;RittmanMead&lt;/a&gt; next month! I&amp;rsquo;m looking forward to working with some of the industry&amp;rsquo;s most respected experts.&lt;/li&gt;&#xA;&lt;/ul&gt;</description>
    </item>
    <item>
      <title>DBMS_STATS - GATHER AUTO</title>
      <link>https://rmoff.net/2011/09/13/dbms_stats-gather-auto/</link>
      <pubDate>Tue, 13 Sep 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/09/13/dbms_stats-gather-auto/</guid>
      <description>&lt;p&gt;In Oracle 11g, the &lt;a href=&#34;http://download.oracle.com/docs/cd/B28359_01/appdev.111/b28419/d_stats.htm&#34;&gt;DBMS_STATS&lt;/a&gt; procedure &lt;a href=&#34;http://download.oracle.com/docs/cd/B28359_01/appdev.111/b28419/d_stats.htm#BEIBJJHC&#34;&gt;GATHER_SCHEMA_STATS&lt;/a&gt; takes a parameter &amp;lsquo;options&amp;rsquo; which defines the scope of the objects processed by the procedure call, as well as the action. It can be either GATHER or LIST (gather the stats, or list out the objects to be touched, respectively), and AUTO, STALE or EMPTY (defining the object selection to process).&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&lt;strong&gt;GATHER&lt;/strong&gt; on its own will gather stats on all objects in the schema&lt;/li&gt;&#xA;&lt;li&gt;&lt;strong&gt;GATHER EMPTY / LIST EMPTY&lt;/strong&gt; is self-explanatory - objects with no statistics.&lt;/li&gt;&#xA;&lt;li&gt;&lt;strong&gt;GATHER STALE / LIST STALE&lt;/strong&gt; is pretty obvious too - objects that have stale statistics (i.e. have had 10% change to them since statistics were last gathered). NB this 10% can be changed at an object/schema/DB level.&lt;/li&gt;&#xA;&lt;li&gt;However, the documentation is ambiguous as to the precise function of &lt;strong&gt;GATHER AUTO / LIST AUTO&lt;/strong&gt;.&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;p&gt;There&amp;rsquo;s even a MOS note, &lt;a href=&#34;https://supporthtml.oracle.com/ep/faces/secure/km/DocumentDisplay.jspx?id=228186.1&#34;&gt;&amp;ldquo;Differences between GATHER STALE and GATHER AUTO (Doc ID 228186.1)&amp;rdquo;&lt;/a&gt;, which strangely enough - given the precision of its title - doesn&amp;rsquo;t really explain the difference.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Using preupgrade to upgrade Fedora 14 to Fedora 15 - proxy errors</title>
      <link>https://rmoff.net/2011/09/12/using-preupgrade-to-upgrade-fedora-14-to-fedora-15-proxy-errors/</link>
      <pubDate>Mon, 12 Sep 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/09/12/using-preupgrade-to-upgrade-fedora-14-to-fedora-15-proxy-errors/</guid>
      <description>&lt;p&gt;When using preupgrade to upgrade an existing Fedora 14 installation to Fedora 15, the following two errors were encountered:&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;Failed to fetch release info&lt;/li&gt;&#xA;&lt;li&gt;No groups available in any repository&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;p&gt;The box sits on a network behind a proxy out to the web.&lt;/p&gt;&#xA;&lt;p&gt;The resolution was to make sure that environment variables &lt;strong&gt;http_proxy&lt;/strong&gt; and &lt;strong&gt;https_proxy&lt;/strong&gt; are set:&lt;/p&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#008000&#34;&gt;export&lt;/span&gt; &lt;span style=&#34;color:#19177c&#34;&gt;http_proxy&lt;/span&gt;&lt;span style=&#34;color:#666&#34;&gt;=&lt;/span&gt;http://user:password@proxyserver:port&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#008000&#34;&gt;export&lt;/span&gt; &lt;span style=&#34;color:#19177c&#34;&gt;https_proxy&lt;/span&gt;&lt;span style=&#34;color:#666&#34;&gt;=&lt;/span&gt;http://user:password@proxyserver:port&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;Make sure you do this from the user from which you run preupgrade.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Labelling Time axes in Excel</title>
      <link>https://rmoff.net/2011/09/08/labelling-time-axes-in-excel/</link>
      <pubDate>Thu, 08 Sep 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/09/08/labelling-time-axes-in-excel/</guid>
      <description>&lt;p&gt;Excel may send chills down our spines when we hear users talking about its [ab]use, but it has its place in the toolset. For my money, it is a very good tool for knocking out graphs which look decent. Of course, rrdtool is my geek tool of choice for dynamic long-term graphing, but when doing scratch PoC work, I normally fall back to Excel.&lt;/p&gt;&#xA;&lt;p&gt;One thing which has frustrated me over time is, well, time, and Excel&amp;rsquo;s handling thereof. How many times (these puns are getting tiresome already) have you seen an axis like this and gnashed your teeth? &lt;img src=&#34;https://rmoff.net/images/rnm1978/2011-09-08_0929_-0000.png&#34; alt=&#34;&#34; title=&#34;2011-09-08_0929_ 0000&#34;&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>A quotation to print out and stick on your wall</title>
      <link>https://rmoff.net/2011/08/18/a-quotation-to-print-out-and-stick-on-your-wall/</link>
      <pubDate>Thu, 18 Aug 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/08/18/a-quotation-to-print-out-and-stick-on-your-wall/</guid>
      <description>&lt;p&gt;Here&amp;rsquo;s a quotation that I&amp;rsquo;ve just read and wanted to share. It is all part of a &lt;a href=&#34;http://www.battleagainstanyguess.com/&#34;&gt;BAAG&lt;/a&gt; approach to troubleshooting problems, performance in particular.&lt;/p&gt;&#xA;&lt;p&gt;From Greg Rahn (&lt;a href=&#34;http://structureddata.org/&#34;&gt;web&lt;/a&gt;|&lt;a href=&#34;http://twitter.com/#!/gregrahn&#34;&gt;twitter&lt;/a&gt;) on oracle-l:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;There are always exceptions, but exceptions can be justified and supported with data. Just beware of the silver bullet syndrome&amp;hellip;&lt;/p&gt;&#xA;&lt;p&gt;The unfortunate part [&amp;hellip;] is that rarely anyone goes back and does the root cause analysis. It tends to fall into the bucket of &amp;ldquo;problem&amp;hellip;solved&amp;rdquo;.&lt;/p&gt;</description>
    </item>
    <item>
      <title>New blog from Oracle - OBI Product Assurance</title>
      <link>https://rmoff.net/2011/08/15/new-blog-from-oracle-obi-product-assurance/</link>
      <pubDate>Mon, 15 Aug 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/08/15/new-blog-from-oracle-obi-product-assurance/</guid>
      <description>&lt;p&gt;Blogging from Oracle itself about OBIEE has always been a bit sparse, certainly in comparison to that which there is for core RDBMS.&lt;/p&gt;&#xA;&lt;p&gt;It&amp;rsquo;s good to see a new blog emerge in the last couple of months from OBI Product Assurance, including some nice &amp;rsquo;n spicy detailed config/tuning info.&lt;/p&gt;&#xA;&lt;p&gt;Find it here: &lt;a href=&#34;http://blogs.oracle.com/pa/&#34;&gt;http://blogs.oracle.com/pa/&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;p&gt;There&amp;rsquo;s a couple more OBI blogs from Oracle, but all are fairly stale:&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://blogs.oracle.com/robreynolds/&#34;&gt;Implementing Oracle BI &amp;amp; EPM Solutions - Rob Reynolds&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://blogs.oracle.com/bi/&#34;&gt;OBIEE Ramblings&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://blogs.oracle.com/bifoundation/&#34;&gt;Oracle BI Foundation&lt;/a&gt; (&lt;em&gt;thanks Daan for pointing this one out&lt;/em&gt;)&lt;/li&gt;&#xA;&lt;/ul&gt;</description>
    </item>
    <item>
      <title>Did you hear that thunk? That was me falling off my chair in shock</title>
      <link>https://rmoff.net/2011/08/08/did-you-hear-that-thunk-that-was-me-falling-off-my-chair-in-shock/</link>
      <pubDate>Mon, 08 Aug 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/08/08/did-you-hear-that-thunk-that-was-me-falling-off-my-chair-in-shock/</guid>
      <description>&lt;p&gt;OK, &lt;a href=&#34;http://twitter.com/#!/rnm1978/status/98045868221018113&#34;&gt;a bit tired&lt;/a&gt; on a Monday morning, and so a bit sarcastic.&lt;/p&gt;&#xA;&lt;p&gt;I&amp;rsquo;ve not really fallen off my chair, but I am shocked. I honestly didn&amp;rsquo;t think it would happen.&lt;/p&gt;&#xA;&lt;p&gt;Oracle have finally released OBI 11g for HP-UX Itanium:&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/rnm1978/2011-08-08_0743_-0000.png&#34; alt=&#34;&#34; title=&#34;2011-08-08_0743_ 0000&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/rnm1978/2011-08-08_0749_-0000.png&#34; alt=&#34;&#34; title=&#34;2011-08-08_0749_ 0000&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;In other news, patchset 10.1.3.4.2 for OBI 10g was released today, I wonder if/when we&amp;rsquo;ll get an HP-UX Itanium version? The download page has it conspicuous by its absence even from &amp;ldquo;Coming Soon&amp;rdquo;: &lt;img src=&#34;https://rmoff.net/images/rnm1978/2011-08-08_0755_-0000.png&#34; alt=&#34;&#34; title=&#34;2011-08-08_0755_ 0000&#34;&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>Have you defined CLIENT_ID in OBIEE yet?</title>
      <link>https://rmoff.net/2011/08/08/have-you-defined-client_id-in-obiee-yet/</link>
      <pubDate>Mon, 08 Aug 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/08/08/have-you-defined-client_id-in-obiee-yet/</guid>
      <description>&lt;p&gt;Have you defined CLIENT_ID in your OBIEE RPD yet? You really ought to.&lt;/p&gt;&#xA;&lt;p&gt;As well as helping track down users of troublesome queries, it also tags dump files with the OBIEE user of an offending query should the worst occur: &lt;img src=&#34;https://rmoff.net/images/rnm1978/2011-08-05_1600_-0000.png&#34; alt=&#34;&#34; title=&#34;2011-08-05_1600_ 0000&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;For details, see:&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://rmoff.net/2010/01/26/identify-your-obiee-users-by-setting-client-id-in-oracle-connection/&#34;&gt;Identify your OBIEE users by setting Client ID in Oracle connection&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://rmoff.net/2011/02/02/instrumenting-obiee-for-tracing-oracle-db-calls/&#34;&gt;Instrumenting OBIEE for tracing Oracle DB calls&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;</description>
    </item>
    <item>
      <title>OBIEE 10.1.3.4.2 released</title>
      <link>https://rmoff.net/2011/08/08/obiee-10-1-3-4-2-released/</link>
      <pubDate>Mon, 08 Aug 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/08/08/obiee-10-1-3-4-2-released/</guid>
      <description>&lt;p&gt;A new version of OBI 10g (remember that?) has just been released, the Oracle twitter machine announced: &lt;img src=&#34;https://rmoff.net/images/rnm1978/2011-08-08_0737_-0000.png&#34; alt=&#34;&#34; title=&#34;2011-08-08_0737_ 0000&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;Along with presumably a bunch of bugfixes, the release notes list new functionality in catalog manager: &lt;img src=&#34;https://rmoff.net/images/rnm1978/2011-08-08_0800_-0000.png&#34; alt=&#34;&#34; title=&#34;2011-08-08_0800_ 0000&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;&lt;a href=&#34;http://www.oracle.com/technetwork/middleware/bi-enterprise-edition/downloads/business-intelligence-10g-165415.html&#34;&gt;Download 10.1.3.4.2 from here&lt;/a&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>ODI 10g connectivity problem with OCI</title>
      <link>https://rmoff.net/2011/08/04/odi-10g-connectivity-problem-with-oci/</link>
      <pubDate>Thu, 04 Aug 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/08/04/odi-10g-connectivity-problem-with-oci/</guid>
      <description>&lt;p&gt;Trying to connect to a repository in ODI using OCI. Target database is Oracle 11.1.0.7. &lt;img src=&#34;https://rmoff.net/images/rnm1978/2011-08-04_1129_-0000.png&#34; alt=&#34;&#34; title=&#34;2011-08-04_1129_ 0000&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;Throws this error:&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;com.sunopsis.sql.l: Oracle Data Integrator Timeout: connection with URL jdbc:oracle:oci8:@ODIPRD and user ODI_USER.&#xD;&#xA;&#x9;at com.sunopsis.sql.SnpsConnection.a(SnpsConnection.java)&#xD;&#xA;&#x9;at com.sunopsis.sql.SnpsConnection.t(SnpsConnection.java)&#xD;&#xA;&#x9;at com.sunopsis.sql.SnpsConnection.connect(SnpsConnection.java)&#xD;&#xA;&#x9;at com.sunopsis.tools.connection.DwgRepositoryConnectionsCreator.a(DwgRepositoryConnectionsCreator.java)&#xD;&#xA;&#x9;at com.sunopsis.tools.connection.DwgRepositoryConnectionsCreator.a(DwgRepositoryConnectionsCreator.java)&#xD;&#xA;&#x9;at com.sunopsis.graphical.l.oi.a(oi.java)&#xD;&#xA;[...]&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Normally this error would be caused by a misconfigured Oracle client. For example, a missing or incorrect tnsnames.ora entry. I validated these and got a successful response using tnsping.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Security issue on OBIEE 10.1.3.4.1, 11.1.1.3</title>
      <link>https://rmoff.net/2011/08/04/security-issue-on-obiee-10-1-3-4-1-11-1-1-3/</link>
      <pubDate>Thu, 04 Aug 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/08/04/security-issue-on-obiee-10-1-3-4-1-11-1-1-3/</guid>
      <description>&lt;p&gt;July&amp;rsquo;s &lt;a href=&#34;http://www.oracle.com/technetwork/topics/security/cpujuly2011-313328.html&#34;&gt;Critical Patch Update&lt;/a&gt; from Oracle includes &lt;a href=&#34;http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2011-2241&#34;&gt;CVE-2011-2241&lt;/a&gt;, which affects OBIEE versions 10.1.3.4.1 and 11.1.1.3. No details of the exploit other than it &amp;ldquo;allows remote attackers to affect availability via unknown vectors related to Analytics Server.&amp;rdquo;&lt;/p&gt;&#xA;&lt;p&gt;It is categorised with a &lt;a href=&#34;http://nvd.nist.gov/cvss.cfm?version=2&amp;amp;name=CVE-2011-2241&amp;amp;vector=(AV%3AN/AC%3AL/Au%3AN/C%3AN/I%3AN/A%3AP)&#34;&gt;CVSS score of 5&lt;/a&gt; (on a scale of 10), with no impact on Authentication, Confidentiality, or Integrity, and &amp;ldquo;Partial+&amp;rdquo; impact on Availability. So to a security-unqualified layman (me), it sounds like someone could remotely crash your NQSServer process, but not do any more damage than that.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Undocumented nqcmd parameters</title>
      <link>https://rmoff.net/2011/07/13/undocumented-nqcmd-parameters/</link>
      <pubDate>Wed, 13 Jul 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/07/13/undocumented-nqcmd-parameters/</guid>
      <description>&lt;p&gt;I noticed on &lt;a href=&#34;http://gerardnico.com/wiki/dat/obiee/&#34;&gt;Nico&amp;rsquo;s wiki&lt;/a&gt; (which is amazing by the way, it has &lt;strong&gt;so&lt;/strong&gt; much information in it) a &lt;a href=&#34;http://gerardnico.com/wiki/dat/obiee/nqcmd&#34;&gt;bunch of additional parameters for nqcmd&lt;/a&gt; other than those which are displayed in the default helptext (nqcmd -h).&lt;/p&gt;&#xA;&lt;p&gt;These are the additional ones:&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;-b&#xD;&#xA;-w&#xD;&#xA;-c&#xD;&#xA;-n&#xD;&#xA;-r&#xD;&#xA;-t&#xD;&#xA;-T (a flag to turn on time statistics)&#xD;&#xA;-SmartDiff (a flag to enable SmartDiff tags in output)&#xD;&#xA;-P&#xD;&#xA;-impersonate &#xD;&#xA;-runas&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Most parameters don&amp;rsquo;t appear to work in a default call of nqcmd in 10g and 11g, throwing an &lt;strong&gt;Argument error near:&lt;/strong&gt; error.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Oracle 11g - How to force a sql_id to use a plan_hash_value using SQL Baselines</title>
      <link>https://rmoff.net/2011/06/28/oracle-11g-how-to-force-a-sql_id-to-use-a-plan_hash_value-using-sql-baselines/</link>
      <pubDate>Tue, 28 Jun 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/06/28/oracle-11g-how-to-force-a-sql_id-to-use-a-plan_hash_value-using-sql-baselines/</guid>
      <description>&lt;p&gt;Here&amp;rsquo;s a scenario that&amp;rsquo;ll be depressingly familiar to most reading this: after ages of running fine, and no changes to the code, a query suddenly starts running orders of magnitude longer than it used to.&lt;/p&gt;&#xA;&lt;p&gt;In this instance it was an ETL step which used to take c.1 hour, and was now at 5 hours and counting. Since it still hadn&amp;rsquo;t finished, and the gods had conspired to bring down Grid too (unrelated), I generated a SQL Monitor report to see what was happening:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Global statistics high/low values when using DBMS_STATS.COPY_TABLE_STATS</title>
      <link>https://rmoff.net/2011/06/15/global-statistics-highlow-values-when-using-dbms_stats-copy_table_stats/</link>
      <pubDate>Wed, 15 Jun 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/06/15/global-statistics-highlow-values-when-using-dbms_stats-copy_table_stats/</guid>
      <description>&lt;p&gt;There is a well-documented problem relating to DBMS_STATS.COPY_TABLE_STATS between partitions where high/low values of the partitioning key column were just copied verbatim from the source partition. This particular problem has now been patched (see &lt;a href=&#34;https://supporthtml.oracle.com/ep/faces/secure/km/DocumentDisplay.jspx?id=8318020.8&#34;&gt;8318020.8&lt;/a&gt;). For background, see Doug Burns&amp;rsquo; &lt;a href=&#34;http://oracledoug.com/serendipity/&#34;&gt;blog&lt;/a&gt; and his &lt;a href=&#34;http://oracledoug.com/serendipity/index.php?/archives/1632-Symposium-2011-My-Presentation.html&#34;&gt;excellent paper&lt;/a&gt; which covers the whole topic of statistics on partitioned tables.&lt;/p&gt;&#xA;&lt;p&gt;This post &lt;a href=&#34;http://blogs.oracle.com/optimizer/2009/02/maintaining_statistics_on_large_partitioned_tables.html&#34;&gt;Maintaining statistics on large partitioned tables&lt;/a&gt; on the Oracle Optimizer blog details what the dbms_stats.copy_table_stats does with regards to the high/low values:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Data Warehousing and Statistics in Oracle 11g - Automatic Optimizer Statistics Collection</title>
      <link>https://rmoff.net/2011/05/26/data-warehousing-and-statistics-in-oracle-11g-automatic-optimizer-statistics-collection/</link>
      <pubDate>Thu, 26 May 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/05/26/data-warehousing-and-statistics-in-oracle-11g-automatic-optimizer-statistics-collection/</guid>
      <description>&lt;h2 id=&#34;chucking-a-stick-in-the-spokes-of-your-carefully-tested-etlbi-&#34;&gt;Chucking a stick in the spokes of your carefully-tested ETL/BI &amp;hellip;&lt;/h2&gt;&#xA;&lt;p&gt;My opinion is that automated stats gathering for non-system objects should be disabled on Oracle Data Warehouses across all environments.&lt;/p&gt;&#xA;&lt;p&gt;All it does is cover up poor design or implementation which has omitted to consider statistics management. Once you get into the realms of millions or billions of rows of data, the automated housekeeping may well not have time to stat all of your tables on each run. And then it becomes a quasi-lottery as to when your tables will get processed. Or what if you&amp;rsquo;re working with intra-day loads (eg. near real-time) - the housekeeping job only runs once a day by default.&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBIEE performance - get your database sweating</title>
      <link>https://rmoff.net/2011/05/19/obiee-performance-get-your-database-sweating/</link>
      <pubDate>Thu, 19 May 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/05/19/obiee-performance-get-your-database-sweating/</guid>
      <description>&lt;p&gt;Just because something produces the correct numbers on the report, it doesn&amp;rsquo;t mean you can stop there.&lt;/p&gt;&#xA;&lt;p&gt;&lt;strong&gt;How&lt;/strong&gt; you are producing those numbers matters, and matters a lot if you have an interest in the long-term health of your system and its ability to scale.&lt;/p&gt;&#xA;&lt;p&gt;OBIEE is the case in point here, but the principle applies to any architecture with &amp;gt;1 tiers or components.&lt;/p&gt;&#xA;&lt;p&gt;Let me start with a rhetorical question. The user has got a report which has ten rows of data. Which of the following methods is going to be a more efficient way to generate the report?&lt;/p&gt;</description>
    </item>
    <item>
      <title>Entertaining Exadata FUD from HP</title>
      <link>https://rmoff.net/2011/04/11/entertaining-exadata-fud-from-hp/</link>
      <pubDate>Mon, 11 Apr 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/04/11/entertaining-exadata-fud-from-hp/</guid>
      <description>&lt;p&gt;Chris Mellor at The Register posted an interesting article a couple of days ago, entitled &lt;a href=&#34;http://www.theregister.co.uk/2011/04/07/hp_violin_exadata_killer/&#34;&gt;HP and Violin build Oracle Exadata killer&lt;/a&gt;. The slidedeck has been removed from HP&amp;rsquo;s FTP site, but a bit of Google magic throws up &lt;a href=&#34;http://www.google.co.uk/search?q=feb2511_Iwicki.pdf+filetype:pdf&amp;amp;num=30&amp;amp;hl=en&amp;amp;safe=off&amp;amp;prmd=ivns&amp;amp;filter=0&#34;&gt;a couple of mirror copies&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;p&gt;It&amp;rsquo;s an entertaining read (&lt;em&gt;&amp;ldquo;Do a Proof of Concept! 94% win rate!! We can and do win against Exadata!!&amp;rdquo;&lt;/em&gt;), and a nice illustration of the &lt;a href=&#34;http://en.wikipedia.org/wiki/Fear,_uncertainty_and_doubt&#34;&gt;FUD&lt;/a&gt; techniques that companies use in marketing their products against others. Greg Rahn has &lt;a href=&#34;http://structureddata.org/tag/fud/&#34;&gt;taken Netezza to task in the past&lt;/a&gt; for this, and to be fair at least Netezza had a serious white paper to back up their arguments. HP&amp;rsquo;s deck (including choice sections such as &amp;ldquo;How to sell against Exadata&amp;rdquo;) is IMHO nothing more than a biased set of arguments for salespeople to use to attempt to bullshit customers.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Oracle documentation - available on Kindle and iPad</title>
      <link>https://rmoff.net/2011/04/07/oracle-documentation-available-on-kindle-and-ipad/</link>
      <pubDate>Thu, 07 Apr 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/04/07/oracle-documentation-available-on-kindle-and-ipad/</guid>
      <description>&lt;p&gt;Whilst perusing the Oracle database documentation, I noticed something which caught my eye:&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/rnm1978/snag-2011-04-07-13-11-45-0000.png&#34; alt=&#34;&#34; title=&#34;SNAG-2011-04-07-13.11.45-0000&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;As well as reading the documentation online as HTML or downloading as PDF for viewing on your computer etc, you can also download it in formats (Mobi and ePub) designed for eReaders such as the Kindle and iPad (the latter obviously isn&amp;rsquo;t &amp;ldquo;just&amp;rdquo; an eReader). For information on format support, there&amp;rsquo;s a handy table &lt;a href=&#34;http://en.wikipedia.org/wiki/Comparison_of_e-book_formats#Supporting_Hardware&#34;&gt;on Wikipedia&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;p&gt;It looks like the availability of mobi/epub files isn&amp;rsquo;t universal. For example, the OBI 11g documentation appears still to be HTML &amp;amp; ZIP only. And whilst the Database documentation doesn&amp;rsquo;t link to the mobi files on the contents page, only each document&amp;rsquo;s TOC, the Essbase documentation does:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Oracle XE 11gR2 installation - &#34;OracleXEService should not be installed already&#34;</title>
      <link>https://rmoff.net/2011/04/04/oracle-xe-11gr2-installation-oraclexeservice-should-not-be-installed-already/</link>
      <pubDate>Mon, 04 Apr 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/04/04/oracle-xe-11gr2-installation-oraclexeservice-should-not-be-installed-already/</guid>
      <description>&lt;p&gt;Oracle XE 11gR2 beta has just been released, some &lt;a href=&#34;http://technology.amis.nl/blog/11785/oracle-xe-11gr2-the-free-express-edition-for-oracle-database-11gr2&#34;&gt;details here&lt;/a&gt; and &lt;a href=&#34;http://www.oracle.com/technetwork/database/express-edition/11gxe-beta-download-302519.html&#34;&gt;download here&lt;/a&gt;. It&amp;rsquo;s not a great deal of use for sandboxing DWH-specific stuff, given this list of &lt;strong&gt;excluded&lt;/strong&gt; functionality (and this is by no means everything that&amp;rsquo;s not included):&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;Bitmapped index, bitmapped join index, and bitmap plan conversions&lt;/li&gt;&#xA;&lt;li&gt;Oracle Partitioning&lt;/li&gt;&#xA;&lt;li&gt;Parallel Data Pump Export/Import&lt;/li&gt;&#xA;&lt;li&gt;Parallel query/DML&lt;/li&gt;&#xA;&lt;li&gt;Parallel Statement Queuing&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;p&gt;(&lt;a href=&#34;http://download.oracle.com/docs/cd/E17781_01/license.112/e18068/toc.htm#BABJBGGA&#34;&gt;source&lt;/a&gt;)&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;p&gt;However, it&amp;rsquo;s always interesting to have to hand for trying out other things. And I like playing with new toys :)&lt;/p&gt;</description>
    </item>
    <item>
      <title>Getting good quality I/O throughput data</title>
      <link>https://rmoff.net/2011/03/11/getting-good-quality-io-throughput-data/</link>
      <pubDate>Fri, 11 Mar 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/03/11/getting-good-quality-io-throughput-data/</guid>
      <description>&lt;p&gt;This post expands on one I made last year &lt;a href=&#34;https://rmoff.net/2010/09/14/the-danger-of-averages-measuring-io-throughput/&#34;&gt;here&lt;/a&gt; about sampling frequency (of I/O throughput, but it&amp;rsquo;s a generic concept). The background to this is my analysis of the performance and capacity of our data warehouse on Oracle 11g.&lt;/p&gt;&#xA;&lt;p&gt;Before I get too boring, here&amp;rsquo;s the fun bit:&lt;/p&gt;&#xA;&lt;h2 id=&#34;pork-pies-per-hour-pph&#34;&gt;Pork Pies per Hour (PP/h)&lt;/h2&gt;&#xA;&lt;p&gt;Jim wants to enter a championship pork-pie eating competition. He&amp;rsquo;s timed himself practising and over the course of an &lt;strong&gt;hour&lt;/strong&gt; he eats &lt;strong&gt;four pork pies&lt;/strong&gt;. So we might say that his Pork Pies per Hour (PP/h) rate is 4.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Comparing methods for recording I/O - V$SYSSTAT vs HP Measureware</title>
      <link>https://rmoff.net/2011/03/09/comparing-methods-for-recording-io-vsysstat-vs-hp-measureware/</link>
      <pubDate>Wed, 09 Mar 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/03/09/comparing-methods-for-recording-io-vsysstat-vs-hp-measureware/</guid>
      <description>&lt;p&gt;I wrote last year about &lt;a href=&#34;https://rmoff.net/2010/10/26/graphing-io-data-using-gnuplot-and-oracle-vsysstat/&#34;&gt;Graphing I/O data using gnuplot and Oracle V$SYSSTAT&lt;/a&gt;, using a script from Kevin Closson in his article &lt;a href=&#34;http://kevinclosson.wordpress.com/2009/04/28/how-to-produce-raw-spreadsheet-ready-physical-io-data-with-plsql-good-for-exadata-good-for-traditional-storage/&#34;&gt;How To Produce Raw, Spreadsheet-Ready Physical I/O Data With PL/SQL. Good For Exadata, Good For Traditional Storage&lt;/a&gt;. Here I&amp;rsquo;ve got a simple comparison of the data recorded through this script (in essence, Oracle&amp;rsquo;s V$SYSSTAT), and directly on the OS through HP&amp;rsquo;s MeasureWare. It&amp;rsquo;s graphed out with my new favourite tool, rrdtool:&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBIEE Systems Management - dodgy counter behaviour</title>
      <link>https://rmoff.net/2011/03/08/obiee-systems-management-dodgy-counter-behaviour/</link>
      <pubDate>Tue, 08 Mar 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/03/08/obiee-systems-management-dodgy-counter-behaviour/</guid>
      <description>&lt;p&gt;Over the last few months I&amp;rsquo;ve been doing a lot of exploring of OBIEE Systems Management data, covered in a mini-series of blog posts, &lt;a href=&#34;https://rmoff.net/2010/12/06/collecting-obiee-systems-management-data-with-jmx/&#34;&gt;Collecting OBIEE systems management data&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;p&gt;There are a vast number of counters exposed, ranging from the very interesting (Active Sessions, Cache Hits, etc) to the less so (Total Query Piggybacks, although for some seriously hardcore performance tuning even this may be of interest).&lt;/p&gt;&#xA;&lt;p&gt;This short blog post is about a couple of counters which I&amp;rsquo;ve been monitoring but which look not to be entirely reliable. Both are in the Oracle BI DB Connection Pool, and are:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Shiny new geek toys -- rrdtool and screen</title>
      <link>https://rmoff.net/2011/03/01/shiny-new-geek-toys-rrdtool-and-screen/</link>
      <pubDate>Tue, 01 Mar 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/03/01/shiny-new-geek-toys-rrdtool-and-screen/</guid>
      <description>&lt;p&gt;I&amp;rsquo;ve added two new toys to my geek arsenal today. First is one with which I&amp;rsquo;ve dabbled before, but struggled to master. The second is a revelation to me and which I discovered courtesy of twitter.&lt;/p&gt;&#xA;&lt;h2 id=&#34;rrdtool&#34;&gt;rrdtool&lt;/h2&gt;&#xA;&lt;p&gt;&lt;a href=&#34;http://oss.oetiker.ch/rrdtool&#34;&gt;rrdtool&lt;/a&gt; is a data collection and graphing tool which I&amp;rsquo;ve been aware of for a while. I wanted to use it when I wrote about &lt;a href=&#34;https://rmoff.net/2010/12/06/collecting-obiee-systems-management-data-with-jmx/&#34;&gt;Collecting OBIEE systems management data&lt;/a&gt; with JMX, but couldn&amp;rsquo;t get it to work. I&amp;rsquo;ll not lie to you - it is a bitch to work with at first. Or put a more polite way, it has a steep learning curve. But when you reach the top of the curve and realise its potential&amp;hellip;wow. You&amp;rsquo;ll soon understand why it is so widely used. I plan to write this up soon, but it let me draw nice graphs like this: &lt;img src=&#34;https://rmoff.net/images/rnm1978/graph.png&#34; alt=&#34;&#34; title=&#34;OBIEE - OBIA&#34;&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>Changing LDAP settings in an OBIEE RPD with UDML</title>
      <link>https://rmoff.net/2011/02/23/changing-ldap-settings-in-an-obiee-rpd-with-udml/</link>
      <pubDate>Wed, 23 Feb 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/02/23/changing-ldap-settings-in-an-obiee-rpd-with-udml/</guid>
      <description>&lt;p&gt;A chap called Kevin posted a comment on a &lt;a href=&#34;https://rmoff.net/2009/09/09/syntax-for-admintool.exe-command-line-script/&#34;&gt;previous posting of mine&lt;/a&gt; asking&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;&lt;em&gt;did you ever come across anything that could be used to change the LDAP server settings from a command line (admintool.exe, UDML, or otherwise)?&lt;/em&gt;&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;p&gt;I did a quick play around with some UDML and it appears that you can.&lt;/p&gt;&#xA;&lt;h2 id=&#34;set-up-the-initial-ldap-server-definition-in-the-rpd&#34;&gt;Set up the initial LDAP server definition in the RPD&lt;/h2&gt;&#xA;&lt;p&gt;First I added a dummy LDAP server to samplesales.rpd: &lt;img src=&#34;https://rmoff.net/images/rnm1978/snag-2011-02-23-07-46-02-0000.png&#34; alt=&#34;&#34; title=&#34;SNAG-2011-02-23-07.46.02-0000&#34;&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>Instrumenting OBIEE for tracing Oracle DB calls</title>
      <link>https://rmoff.net/2011/02/02/instrumenting-obiee-for-tracing-oracle-db-calls/</link>
      <pubDate>Wed, 02 Feb 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/02/02/instrumenting-obiee-for-tracing-oracle-db-calls/</guid>
      <description>&lt;p&gt;Cary Millsap recently published a paper &amp;ldquo;&lt;a href=&#34;http://carymillsap.blogspot.com/2011/01/new-paper-mastering-performance-with.html&#34;&gt;Mastering Performance with Extended SQL Trace&lt;/a&gt;&amp;rdquo; describing how to use Oracle trace to assist with troubleshooting the performance of database queries. As with all of Cary Millsap&amp;rsquo;s papers it is superbly written, presenting very detailed information in a clear and understandable way. (and yes I do have a &lt;a href=&#34;http://dbakevlar.com/?p=46&#34;&gt;DBA crush&lt;/a&gt; ;-)) It discusses how you can automate the tracing of specific sessions on the database, which requires the application to be appropriately instrumented. This reminded me of a post that I made almost exactly 12 months ago &lt;a href=&#34;https://rmoff.net/2010/01/26/identify-your-obiee-users-by-setting-client-id-in-oracle-connection/&#34;&gt;here&lt;/a&gt;, where I explained how to pass through the username of the OBIEE user to the database. Initially I thought it would be useful simply for being able to pin a rogue query to an end-user, but reading Cary&amp;rsquo;s paper made me realise there is more potential to it.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Materialised Views - PCT Partition Truncation</title>
      <link>https://rmoff.net/2011/01/08/materialised-views-pct-partition-truncation/</link>
      <pubDate>Sat, 08 Jan 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/01/08/materialised-views-pct-partition-truncation/</guid>
      <description>&lt;p&gt;I&amp;rsquo;ve been doing some work recently that involved the use of Materialised Views on Oracle 11g (11.1.0.7), particularly around PCT refresh. There are some things that are not clear from the documentation, or are actually bugs so far as I&amp;rsquo;m concerned, and I&amp;rsquo;ve detailed these below.&lt;/p&gt;&#xA;&lt;p&gt;In this example I was working on part of a DWH with c.2 million rows aggregated up daily. One of the things that I spent a long time trying to get to work was Partition Truncation when using PCT refresh. We had tried and discarded &amp;ldquo;FAST&amp;rdquo; refresh as being completely non-performant for our volumes.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Oracle Whitepaper - &#34;Best Practices for a Data Warehouse on Oracle Database 11g&#34;</title>
      <link>https://rmoff.net/2011/01/05/oracle-whitepaper-best-practices-for-a-data-warehouse-on-oracle-database-11g/</link>
      <pubDate>Wed, 05 Jan 2011 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2011/01/05/oracle-whitepaper-best-practices-for-a-data-warehouse-on-oracle-database-11g/</guid>
      <description>&lt;p&gt;Poking around on My Oracle Support today, I found a link to a white paper dated November 2010, titled &amp;ldquo;Best Practices for a Data Warehouse on Oracle Database 11g&amp;rdquo;. It&amp;rsquo;s new to me and I&amp;rsquo;ve not noticed a blog post announcing it, so I thought I&amp;rsquo;d share it here. It&amp;rsquo;s by Maria Colgan, who has posted in the past on both the &lt;a href=&#34;http://blogs.oracle.com/optimizer/&#34;&gt;Inside the Oracle Optimizer&lt;/a&gt; blog and &lt;a href=&#34;http://blogs.oracle.com/datawarehousing/&#34;&gt;The Data Warehouse Insider&lt;/a&gt; blog. Here&amp;rsquo;s the link to it:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Data Warehousing and Statistics in Oracle 11g - incremental global statistics</title>
      <link>https://rmoff.net/2010/12/30/data-warehousing-and-statistics-in-oracle-11g-incremental-global-statistics/</link>
      <pubDate>Thu, 30 Dec 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/12/30/data-warehousing-and-statistics-in-oracle-11g-incremental-global-statistics/</guid>
      <description>&lt;p&gt;This is a series of posts where I hope to humbly plug some gaps in the information available (or which has escaped my &lt;a href=&#34;http://www.urbandictionary.com/define.php?term=google-fu&#34;&gt;google-fu&lt;/a&gt;) regarding statistics management in Oracle 11g specific to Data Warehousing.&lt;/p&gt;&#xA;&lt;p&gt;Incremental Global Statistics is new functionality in Oracle 11g (and 10.2.0.4?) and is explained in depth in several places including:&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://download.oracle.com/docs/cd/B28359_01/server.111/b28274/stats.htm#i42218&#34;&gt;Oracle® Database Performance Tuning Guide - Statistics on Partitioned Objects&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://structureddata.org/2008/07/16/oracle-11g-incremental-global-statistics-on-partitioned-tables/&#34;&gt;Greg Rahn - Oracle 11g: Incremental Global Statistics On Partitioned Tables&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://blogs.oracle.com/optimizer/2009/02/maintaining_statistics_on_large_partitioned_tables.html&#34;&gt;Inside the Oracle Optimiser - Maintaining statistics on large partitioned tables&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://www.oraclegeek.net/downloads/One_Pass_Distinct_Sampling.ppt&#34;&gt;Amit Poddar - One Pass Distinct Sampling&lt;/a&gt; (ppt - slides 52 onwards are most relevant)&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;p&gt;In essence, Oracle maintains information about each partition when statistics are gathered on the partition, and it uses this to work out the global statistics - without having to scan the whole table. For a more detailed description, see the above links. It&amp;rsquo;s important to note that this is not the same as aggregated global statistics (which Doug Burns &lt;a href=&#34;http://oracledoug.com/serendipity/index.php?/archives/1590-Statistics-on-Partitioned-Tables-Contents.html&#34;&gt;covers in detail here&lt;/a&gt;).&lt;/p&gt;</description>
    </item>
    <item>
      <title>Adding OBIEE monitoring graphs into OAS</title>
      <link>https://rmoff.net/2010/12/06/adding-obiee-monitoring-graphs-into-oas/</link>
      <pubDate>Mon, 06 Dec 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/12/06/adding-obiee-monitoring-graphs-into-oas/</guid>
      <description>&lt;h2 id=&#34;introduction&#34;&gt;Introduction&lt;/h2&gt;&#xA;&lt;p&gt;This is the third part of three detailed articles making up a mini-series about &lt;a href=&#34;https://rmoff.net/2010/12/06/obiee-monitoring/&#34;&gt;OBIEE monitoring&lt;/a&gt;. It demonstrates how to capture OBIEE performance information, and optionally graph it out and serve it through an auto-updating webpage.&lt;/p&gt;&#xA;&lt;p&gt;This final article describes how to bolt on to OAS a simple web page hosting the graphs that you created in &lt;a href=&#34;https://rmoff.net/2010/12/06/charting-obiee-performance-data-with-gnuplot/&#34;&gt;part 2&lt;/a&gt;, plotting data from OBIEE collected in &lt;a href=&#34;https://rmoff.net/2010/12/06/collecting-obiee-systems-management-data-with-jmx/&#34;&gt;part 1&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;h2 id=&#34;the-webpage&#34;&gt;The webpage&lt;/h2&gt;&#xA;&lt;p&gt;This is just an old-school basic HTML page, with a meta-refresh tag (which, note, Chrome doesn&amp;rsquo;t work with) and img tags:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Charting OBIEE performance data with gnuplot</title>
      <link>https://rmoff.net/2010/12/06/charting-obiee-performance-data-with-gnuplot/</link>
      <pubDate>Mon, 06 Dec 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/12/06/charting-obiee-performance-data-with-gnuplot/</guid>
      <description>&lt;h2 id=&#34;introduction&#34;&gt;Introduction&lt;/h2&gt;&#xA;&lt;p&gt;This is the second part of three detailed articles making up a mini-series about &lt;a href=&#34;https://rmoff.net/2010/12/06/obiee-monitoring/&#34;&gt;OBIEE monitoring&lt;/a&gt;. It demonstrates how to capture OBIEE performance information, and optionally graph it out and serve it through an auto-updating webpage.&lt;/p&gt;&#xA;&lt;p&gt;This article takes data that &lt;a href=&#34;https://rmoff.net/2010/12/06/collecting-obiee-systems-management-data-with-jmx/&#34;&gt;part one&lt;/a&gt; showed you how to collect into a tab-separated file that looks something like this:&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;2010-11-29-14:48:18     1       0       11      0       3       2       1       676     340     0       53      1       0       41      0       3       0&#xD;&#xA;2010-11-29-14:49:18     1       0       11      0       3       2       1       676     0       0       0       1       0       0       0       3       0&#xD;&#xA;2010-11-29-14:50:18     2       0       16      1       4       3       1       679     0       0       0       1       0       0       0       4       0&#xD;&#xA;2010-11-29-14:51:18     2       2       19      1       4       3       1       679     32      0       53      1       0       58      0       4       0&#xD;&#xA;2010-11-29-14:52:18     2       1       19      1       4       3       4       682     0       0       0       1       0       0       0       4       0&#xD;&#xA;2010-11-29-14:53:18     2       1       19      1       4       3       4       682     0       0       0       1       0       0       0       4       0&#xD;&#xA;2010-11-29-14:54:18     2       0       19      1       4       3       1       682     0       0       0       1       0       0       0       4       0&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;and plot it into something that looks like this: &lt;img 
src=&#34;https://rmoff.net/images/rnm1978/summary-6hr.png&#34; alt=&#34;&#34; title=&#34;summary.6hr&#34;&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>Collecting OBIEE systems management data with jmx</title>
      <link>https://rmoff.net/2010/12/06/collecting-obiee-systems-management-data-with-jmx/</link>
      <pubDate>Mon, 06 Dec 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/12/06/collecting-obiee-systems-management-data-with-jmx/</guid>
      <description>&lt;h2 id=&#34;introduction&#34;&gt;Introduction&lt;/h2&gt;&#xA;&lt;p&gt;This is the first part of three detailed articles making up a mini-series about &lt;a href=&#34;https://rmoff.net/2010/12/06/obiee-monitoring/&#34;&gt;OBIEE monitoring&lt;/a&gt;. It demonstrates how to capture OBIEE performance information, and optionally graph it out and serve it through an auto-updating webpage.&lt;/p&gt;&#xA;&lt;p&gt;For some background on OBIEE&amp;rsquo;s Systems Management component, along with JMX and MBeans, &lt;a href=&#34;https://rmoff.net/2009/07/22/oracle-bi-management-systems-management-mbeans/&#34;&gt;see here&lt;/a&gt; and &lt;a href=&#34;https://rmoff.net/categories/jmx/&#34;&gt;here&lt;/a&gt;. The following assumes you know your mbeans from coffee beans and jmx from a bmx.&lt;/p&gt;&#xA;&lt;p&gt;The metric collection is built around the &lt;a href=&#34;http://code.google.com/p/jmxsh/&#34;&gt;jmxsh&lt;/a&gt; tool. This is similar to &lt;a href=&#34;http://wiki.cyclopsgroup.org/jmxterm&#34;&gt;jmxterm&lt;/a&gt; and both provide command-line access to jmx. Once it&amp;rsquo;s commandline, it&amp;rsquo;s scriptable :)&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBIEE monitoring</title>
      <link>https://rmoff.net/2010/12/06/obiee-monitoring/</link>
      <pubDate>Mon, 06 Dec 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/12/06/obiee-monitoring/</guid>
      <description>&lt;p&gt;Those of you who read my blog regularly may have noticed I have a slight obsession with the OBIEE systems management capability which is exposed through &lt;a href=&#34;https://rmoff.net/categories/jmx/&#34;&gt;JMX&lt;/a&gt;. Venkat has &lt;a href=&#34;http://www.rittmanmead.com/2010/11/29/oracle-bi-ee-11g-systems-management-api-jmx-mbeans-dynamic-user-generation/&#34;&gt;blogged this week about JMX in OBI11g&lt;/a&gt;, and it&amp;rsquo;s clearly a technology worth understanding properly. I&amp;rsquo;ve recently been tinkering with how to make use of it for monitoring purposes, most recently using JConsole and &lt;a href=&#34;https://rmoff.net/2010/11/04/a-poor-mans-obiee-embi-management-pack/&#34;&gt;discussed here&lt;/a&gt;. What follows is an extension of this idea, cobbled together with a bit of shell scripting, awk, gnuplot, and &lt;a href=&#34;http://www.google.co.uk/search?q=blue+peter+sticky+backed+plastic&#34;&gt;sticky backed plastic&lt;/a&gt;. It&amp;rsquo;s built on OBIEE 10g - for OBI11g it may differ (although I understand that Performance MBeans still exist).&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBIEE 10g - javahost hang</title>
      <link>https://rmoff.net/2010/12/03/obiee-10g-javahost-hang/</link>
      <pubDate>Fri, 03 Dec 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/12/03/obiee-10g-javahost-hang/</guid>
      <description>&lt;p&gt;Hot on the heels of &lt;a href=&#34;https://rmoff.net/2010/12/02/troubleshooting-obiee-ldap-adsi-authentication/&#34;&gt;one problem&lt;/a&gt;, another has just reared its head.&lt;/p&gt;&#xA;&lt;p&gt;Users started reporting an error with reports that included charts:&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;Chart server does not appear to be responding in a timely fashion. It may be under heavy load or unavailable.&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/rnm1978/snag-2010-12-03-09-51-31-0000.png&#34; alt=&#34;&#34; title=&#34;SNAG-2010-12-03-09.51.31-0000&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;The setup is an OBIEE 10.1.3.4.1 two-server deployment with BI/PS/Javahost clustered and loadbalanced throughout.&lt;/p&gt;&#xA;&lt;h2 id=&#34;diagnostics&#34;&gt;Diagnostics&lt;/h2&gt;&#xA;&lt;p&gt;Javahost was running, and listening, on both servers:&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;$ps -ef|grep javahost&#xD;&#xA;obieeadm 14076     1  0  Nov 25  ?         9:23 /app/oracle/product/OracleAS_1/jdk/bin/IA64N/java -server -classpath /app/oracle/product/obiee/web/javahost/lib/core/sautils.ja&#xD;&#xA;$netstat -a|grep 9810|grep LISTEN&#xD;&#xA;tcp        0      0  *.9810                 *.*                     LISTEN&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;In the Javahost log file on both servers these errors were reported, but only since javahost had started over a week ago:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Troubleshooting OBIEE - LDAP (ADSI) authentication</title>
      <link>https://rmoff.net/2010/12/02/troubleshooting-obiee-ldap-adsi-authentication/</link>
      <pubDate>Thu, 02 Dec 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/12/02/troubleshooting-obiee-ldap-adsi-authentication/</guid>
      <description>&lt;p&gt;They say about travelling that it&amp;rsquo;s the journey and not the destination, and the same is true with this problem we hit during a deployment to Production.&lt;/p&gt;&#xA;&lt;p&gt;We were deploying a new OBIEE 10g implementation, with authentication provided by Microsoft Active Directory (AD) through the LDAP functionality in OBIEE. As a side note, it&amp;rsquo;s a rather nice way to do authentication, although maybe I&amp;rsquo;m biased coming from our previous implementation which used EBS integrated authentication and was a bugger to set up and work with.&lt;/p&gt;</description>
    </item>
    <item>
      <title>A Poor Man&#39;s OBIEE EM/BI Management Pack</title>
      <link>https://rmoff.net/2010/11/04/a-poor-mans-obiee-embi-management-pack/</link>
      <pubDate>Thu, 04 Nov 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/11/04/a-poor-mans-obiee-embi-management-pack/</guid>
      <description>&lt;p&gt;Folk from Yorkshire are tight, so the stereotype goes. So here&amp;rsquo;s a cheap-ass way to monitor OBIEE 10g using nothing but the OBIEE built-in systems management component, the jmx agent, and jconsole (which is part of the standard Java distribution):&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/rnm1978/poor-mans-em_3.png&#34; alt=&#34;&#34; title=&#34;poor-man&#39;s-EM_3&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;From here you can also export to CSV the various counters, and then store history, plot it out with gnuplot or Excel, etc.&lt;/p&gt;&#xA;&lt;p&gt;If anyone&amp;rsquo;s interested let me know and I&amp;rsquo;ll document a bit more about how I did this, but it&amp;rsquo;s basically building on previous work &lt;a href=&#34;https://rmoff.net/categories/jmx/&#34;&gt;I&amp;rsquo;ve documented around jmx and OBIEE&lt;/a&gt;.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Analysing ODI batch performance</title>
      <link>https://rmoff.net/2010/11/03/analysing-odi-batch-performance/</link>
      <pubDate>Wed, 03 Nov 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/11/03/analysing-odi-batch-performance/</guid>
      <description>&lt;p&gt;I&amp;rsquo;ve been involved with some performance work around an ODI DWH load batch. The batch comprises well over 1000 tasks in ODI, and whilst the Operator console is not a bad interface, it&amp;rsquo;s not very easy to spot the areas consuming the most runtime.&lt;/p&gt;&#xA;&lt;p&gt;Here&amp;rsquo;s a set of SQL statements to run against the ODI work repository tables to help you methodically find the steps of most interest for tuning efforts.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Does this summarise your system development &amp; support ethos?</title>
      <link>https://rmoff.net/2010/10/27/does-this-summarise-your-system-development-support-ethos/</link>
      <pubDate>Wed, 27 Oct 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/10/27/does-this-summarise-your-system-development-support-ethos/</guid>
      <description>&lt;p&gt;I heard this on &lt;a href=&#34;http://www.bbc.co.uk/programmes/b00v1qrz&#34;&gt;Thinking Allowed&lt;/a&gt; and thought how applicable it was to the attitudes that you can sometimes encounter in both systems development, and the support of production systems:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;&lt;em&gt;“Each uneventful day that passes reinforces a steadily growing false sense of confidence that everything is all right – that I, we, my group must be OK because the way we did things today resulted in no adverse consequences.”&lt;/em&gt;&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;p&gt;by &lt;a href=&#34;http://drfd.hbs.edu/fit/public/facultyInfo.do?facInfo=bio&amp;amp;facId=164841&#34;&gt;Scott Snook&lt;/a&gt; (Senior Lecturer in the Organizational Behavior unit at Harvard Business School)&lt;/p&gt;</description>
    </item>
    <item>
      <title>Two excellent OBI presentations from Jeff McQuigg</title>
      <link>https://rmoff.net/2010/10/27/two-excellent-obi-presentations-from-jeff-mcquigg/</link>
      <pubDate>Wed, 27 Oct 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/10/27/two-excellent-obi-presentations-from-jeff-mcquigg/</guid>
      <description>&lt;p&gt;Jeff McQuigg has posted two presentations that he gave at Openworld 2010 on his website here: &lt;a href=&#34;http://greatobi.wordpress.com/2010/10/26/oow-presos/&#34;&gt;http://greatobi.wordpress.com/2010/10/26/oow-presos/&lt;/a&gt;&lt;/p&gt;&#xA;&lt;p&gt;They&amp;rsquo;re full of real content and well worth a read. There are excellent levels of detail and plenty to think about if you&amp;rsquo;re involved in OBI or DW development projects.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Graphing I/O data using gnuplot and Oracle V$SYSSTAT</title>
      <link>https://rmoff.net/2010/10/26/graphing-io-data-using-gnuplot-and-oracle-vsysstat/</link>
      <pubDate>Tue, 26 Oct 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/10/26/graphing-io-data-using-gnuplot-and-oracle-vsysstat/</guid>
      <description>&lt;p&gt;Continuing in the beard-scratching theme of Unix related posts (&lt;a href=&#34;https://rmoff.net/2010/10/19/awk-split-a-fixed-width-file-into-separate-files-named-on-content/&#34;&gt;previously - awk&lt;/a&gt;), here&amp;rsquo;s a way to graph out the I/O profile of your Oracle database via the Oracle metrics in &lt;a href=&#34;http://download.oracle.com/docs/cd/B28359_01/server.111/b28320/dynviews_3086.htm#REFRN30272&#34;&gt;gv$sysstat&lt;/a&gt;, and &lt;a href=&#34;http://www.gnuplot.info/&#34;&gt;gnuplot&lt;/a&gt;. This is only the system I/O as observed by Oracle, so for belts &amp;amp; braces (or to placate a cynical sysadmin ;-)) you may want to cross-reference it with something like sar.&lt;/p&gt;&#xA;&lt;p&gt;First, a pretty picture of what you can get:&lt;/p&gt;</description>
    </item>
    <item>
      <title>awk - split a fixed width file into separate files named on content</title>
      <link>https://rmoff.net/2010/10/19/awk-split-a-fixed-width-file-into-separate-files-named-on-content/</link>
      <pubDate>Tue, 19 Oct 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/10/19/awk-split-a-fixed-width-file-into-separate-files-named-on-content/</guid>
      <description>&lt;p&gt;More of a unix thing than DW/BI this post, but I have a beard so am semi-qualified&amp;hellip;.&lt;/p&gt;&#xA;&lt;p&gt;The requirement was to improve the performance of some ODI processing that as part of its work was taking one huge input file, and splitting it into chunks based on content in the file. To add some (minor) spice the file was fixed width with no delimiters, so the easy awk answers that I found on google weren&amp;rsquo;t applicable.&lt;/p&gt;</description>
    </item>
    <item>
      <title>When is a bug not a bug? When it&#39;s a &#34;design decision&#34;</title>
      <link>https://rmoff.net/2010/10/18/when-is-a-bug-not-a-bug-when-its-a-design-decision/</link>
      <pubDate>Mon, 18 Oct 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/10/18/when-is-a-bug-not-a-bug-when-its-a-design-decision/</guid>
      <description>&lt;p&gt;&lt;a href=&#34;https://rmoff.net/2010/09/02/misbehaving-informatica-kills-oracle/&#34;&gt;Last month I wrote about a problem&lt;/a&gt; that Informatica as part of OBIA was causing us, wherein an expired database account would bring Oracle down by virtue of multiple connections from Informatica.&lt;/p&gt;&#xA;&lt;p&gt;I raised an SR with Oracle (under OBIA support), who after some back-and-forth with Informatica, were told:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;&lt;em&gt;&lt;strong&gt;This is not a bug. That the two error messages coming back from Oracle are handled differently is the result of a design decision and as such not a product fault.&lt;/strong&gt;&lt;/em&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>A good maxim to bear in mind when designing reports</title>
      <link>https://rmoff.net/2010/09/23/a-good-maxim-to-bear-in-mind-when-designing-reports/</link>
      <pubDate>Thu, 23 Sep 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/09/23/a-good-maxim-to-bear-in-mind-when-designing-reports/</guid>
      <description>&lt;p&gt;&lt;a href=&#34;http://www.codinghorror.com/blog/2004/02/about-me.html&#34;&gt;Jeff Atwood&lt;/a&gt; of &lt;a href=&#34;http://www.codinghorror.com/&#34;&gt;Coding Horror&lt;/a&gt; fame &lt;a href=&#34;http://twitter.com/codinghorror/statuses/25282710947&#34;&gt;observed&lt;/a&gt;:&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/rnm1978/snag-2010-09-23-08-02-18-0000.png&#34; alt=&#34;&#34; title=&#34;SNAG-2010-09-23-08.02.18-0000&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;This is sage advice (if a little crude :-)) to bear in mind when building reports for users.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Better safe than sorry...sanitising DB input</title>
      <link>https://rmoff.net/2010/09/22/better-safe-than-sorry-sanitising-db-input/</link>
      <pubDate>Wed, 22 Sep 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/09/22/better-safe-than-sorry-sanitising-db-input/</guid>
      <description>&lt;p&gt;As &lt;a href=&#34;http://www.f-secure.com/weblog/archives/00002034.html&#34;&gt;Twitter learnt yesterday&lt;/a&gt;, you should always sanitise user input. I was amused to see My Oracle Support doing so&amp;hellip;.&lt;a href=&#34;http://www.google.co.uk/search?q=recursion&#34;&gt;recursively&lt;/a&gt; :)&lt;/p&gt;&#xA;&lt;p&gt;The apostrophe in &amp;ldquo;doesn&amp;rsquo;t&amp;rdquo; got escaped once, and then again, and then again, and then again, and then again &amp;hellip;&amp;hellip;&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/rnm1978/snag-22-09-2010-13-32-08-0000.png&#34; alt=&#34;&#34; title=&#34;SNAG-22-09-2010-13.32.08-0000&#34;&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>TortoiseSVN doesn&#39;t prompt for authentication</title>
      <link>https://rmoff.net/2010/09/21/tortoisesvn-doesnt-prompt-for-authentication/</link>
      <pubDate>Tue, 21 Sep 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/09/21/tortoisesvn-doesnt-prompt-for-authentication/</guid>
      <description>&lt;p&gt;Here&amp;rsquo;s one in the series of stupid things I&amp;rsquo;ve done but which Google has thrown no answers, so I post it here to help out fellow idiots.&lt;/p&gt;&#xA;&lt;p&gt;Today&amp;rsquo;s episode involves our SCM tool, TortoiseSVN. I&amp;rsquo;d been happily using it for over a year, when suddenly I couldn&amp;rsquo;t commit any more. I could browse and checkout to my heart&amp;rsquo;s content, but when I tried to commit, &lt;em&gt;boom&lt;/em&gt;:&lt;/p&gt;&#xA;&lt;p&gt;&lt;strong&gt;Commit failed (details follow): Authorization failed&lt;/strong&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>The danger of averages - Measuring I/O throughput</title>
      <link>https://rmoff.net/2010/09/14/the-danger-of-averages-measuring-io-throughput/</link>
      <pubDate>Tue, 14 Sep 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/09/14/the-danger-of-averages-measuring-io-throughput/</guid>
      <description>&lt;p&gt;&lt;a href=&#34;http://www.jameskoopmann.com/scripts/wrh_sysstat_ioworkload_ALL.sql&#34;&gt;This query&lt;/a&gt;, based on AWR snapshots on sys.wrh$_sysstat, includes in its metrics the I/O read throughput for a given snapshot duration.&lt;/p&gt;&#xA;&lt;p&gt;However it&amp;rsquo;s important to realise the huge limitation to this figure - it&amp;rsquo;s an average. It completely shoots you in the foot if you&amp;rsquo;re looking at capacity requirements.&lt;/p&gt;&#xA;&lt;p&gt;Consider this real-life example extracted from the above query:&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;Timestamp&#x9;&#x9;&#x9;&#x9;&#x9;Total Read MBPS&#xD;&#xA;===========================================&#xD;&#xA;14-SEP-10 05.15.12.660      113.748&#xD;&#xA;14-SEP-10 06.00.40.953      202.250&#xD;&#xA;14-SEP-10 06.45.52.750       34.649&#xD;&#xA;14-SEP-10 07.30.03.394       10.953&#xD;&#xA;14-SEP-10 08.15.15.243       57.833&#xD;&#xA;14-SEP-10 09.00.27.180       30.177&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;So, it looks like early in the morning we&amp;rsquo;re using about 200 MB/s throughput, and by about 9am somewhere around 30-50 MB/s ?&lt;/p&gt;</description>
    </item>
    <item>
      <title>RTFM? But where TF is the FM?  &gt;&gt; Offline searchable OBIEE 11g documentation</title>
      <link>https://rmoff.net/2010/09/13/rtfm-but-where-tf-is-the-fm-offline-searchable-obiee-11g-documentation/</link>
      <pubDate>Mon, 13 Sep 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/09/13/rtfm-but-where-tf-is-the-fm-offline-searchable-obiee-11g-documentation/</guid>
      <description>&lt;p&gt;I&amp;rsquo;m a geek. I like understanding things in their absolute entirety. It frustrates me to have to make presumptions or assumptions about something. I like to get down &amp;rsquo;n dirty and find out what makes things tick.&lt;/p&gt;&#xA;&lt;p&gt;So that necessitates reading manuals. And unfortunately, Oracle don&amp;rsquo;t make that easy all the time.&lt;/p&gt;&#xA;&lt;p&gt;Some of the manuals &lt;a href=&#34;https://rmoff.net/2009/11/27/i-think-this-summarises-everything/&#34;&gt;are not very good&lt;/a&gt;, and others are difficult to find. Given the complexity of the OBIEE stack and proliferation of terminology and product names, sorting the wheat from the chaff can be a headache. Fair enough, for a corporation as big as Oracle with a product portfolio in the [hundreds? thousands?] creating a unified, comprehensive, easy-to-navigate point of reference for documentation must be near on impossible - but that&amp;rsquo;s their problem to figure out and mine to bitch about until it&amp;rsquo;s done ;-)&lt;/p&gt;</description>
    </item>
    <item>
      <title>A fair bite of the CPU pie? Monitoring &amp; Testing Oracle Resource Manager</title>
      <link>https://rmoff.net/2010/09/10/a-fair-bite-of-the-cpu-pie-monitoring-testing-oracle-resource-manager/</link>
      <pubDate>Fri, 10 Sep 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/09/10/a-fair-bite-of-the-cpu-pie-monitoring-testing-oracle-resource-manager/</guid>
      <description>&lt;h2 id=&#34;introduction&#34;&gt;Introduction&lt;/h2&gt;&#xA;&lt;p&gt;We&amp;rsquo;re in the process of implementing &lt;a href=&#34;http://download.oracle.com/docs/cd/B28359_01/server.111/b28310/dbrm.htm#i1010776&#34;&gt;Resource Manager&lt;/a&gt; (RM) on our Oracle 11gR1 Data Warehouse. We&amp;rsquo;ve currently got one DW application live, but have several more imminent. We identified RM as a suitable way of - as the name would suggest - managing the resources on the server.&lt;/p&gt;&#xA;&lt;p&gt;In the first instance we&amp;rsquo;re looking at simply protecting CPU for, and from, future applications. At some point it would be interesting to use some of the more granular and precise functions to demote long-running queries, have nighttime/daytime plans, etc. I&amp;rsquo;d also like to explore the &lt;a href=&#34;http://www.oracle-base.com/articles/11g/ResourceManagerEnhancements_11gR1.php#per_session_io_limits&#34;&gt;management of IO&lt;/a&gt; but for us the pain is in bandwidth that a query consumes, and it looks like RM can only work with total session MB, or IOPS. Reading about Exadata it sounds like the Exadata I/O Resource Management might do this ([&amp;hellip;]&lt;a href=&#34;http://blogs.sun.com/Samson/entry/oracle_exadata_storage_server_a&#34;&gt;It allows intra and inter-database I/O bandwidth to be defined and managed&lt;/a&gt;[..]). But for that I&amp;rsquo;ll have to write to Santa and promise to be a good boy this year.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Misbehaving Informatica kills Oracle</title>
      <link>https://rmoff.net/2010/09/02/misbehaving-informatica-kills-oracle/</link>
      <pubDate>Thu, 02 Sep 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/09/02/misbehaving-informatica-kills-oracle/</guid>
      <description>&lt;p&gt;This problem, which in essence is bad behaviour from Informatica bringing down Oracle, is a good illustration of unintended consequences of an apparently innocuous security setting. Per our company&amp;rsquo;s security standards, database passwords expire every 90 days. When this happens users are prompted to change their password before they can continue logging into Oracle. This applies to application user IDs too. It appears that Informatica 8.6.1 HF6 (part of OBIA 7.9.6.1) doesn&amp;rsquo;t handle an expired password well, spawning multiple connections to the database, eventually bringing Oracle down through memory SWAP space exhaustion.&lt;/p&gt;</description>
    </item>
    <item>
      <title>BI Publisher - error creating Quartz tables</title>
      <link>https://rmoff.net/2010/08/25/bi-publisher-error-creating-quartz-tables/</link>
      <pubDate>Wed, 25 Aug 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/08/25/bi-publisher-error-creating-quartz-tables/</guid>
      <description>&lt;p&gt;A very short blog post to break the drought, but I didn&amp;rsquo;t hit any google results for this error so thought it might be useful to record it.&lt;/p&gt;&#xA;&lt;p&gt;In BI Publisher 10.1.3.4, trying to install the Scheduler (Quartz) schema, I got this error:&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;Schema installation failed while creating tables. Schema may already exist. Please remove the existing schema or choose another database and try again.&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;To me, the error text is a bit unhelpful. Whilst the first statement is correct - &amp;ldquo;Schema installation failed while creating tables&amp;rdquo;, it doesn&amp;rsquo;t tell you the error it encountered, and then it goes on to suggest only one reason for the failure.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Measuring real user response times for OBIEE</title>
      <link>https://rmoff.net/2010/06/14/measuring-real-user-response-times-for-obiee/</link>
      <pubDate>Mon, 14 Jun 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/06/14/measuring-real-user-response-times-for-obiee/</guid>
      <description>&lt;p&gt;&lt;a href=&#34;http://twitter.com/alexgorbachev&#34;&gt;@alexgorbachev&lt;/a&gt; &lt;a href=&#34;http://www.bettween.com/rnm1978/alexgorbachev/Jun-11-2010/Jun-14-2010/desc&#34;&gt;tweeted me&lt;/a&gt; recently after picking up my presentation on &lt;a href=&#34;https://rmoff.net/2010/05/24/performance-testing-and-obiee/&#34;&gt;Performance Testing and OBIEE&lt;/a&gt;. &lt;img src=&#34;https://rmoff.net/images/rnm1978/2010-06-14_1115331.png&#34; alt=&#34;&#34; title=&#34;2010-06-14_111533&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;His question got me thinking, and as ever the answer &amp;ldquo;It Depends&amp;rdquo; is appropriate here :-)&lt;/p&gt;&#xA;&lt;h2 id=&#34;why-is-the-measurement-being-done&#34;&gt;Why is the measurement being done?&lt;/h2&gt;&#xA;&lt;p&gt;Without knowing the context of the work Alex is doing, how to measure depends on whether the measurement needs to be of:&lt;/p&gt;&#xA;&lt;ol&gt;&#xA;&lt;li&gt;The actual response times that the users are getting, &lt;strong&gt;or&lt;/strong&gt;&lt;/li&gt;&#xA;&lt;li&gt;The response times that the system is currently capable of delivering&lt;/li&gt;&#xA;&lt;/ol&gt;&#xA;&lt;p&gt;This may sound like splitting hairs or beard-scratching irrelevance, but it&amp;rsquo;s not. If the aim of the exercise is to be able to make a statement along the lines of:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Scripts to extract information from OBIEE NQQuery.log</title>
      <link>https://rmoff.net/2010/06/11/scripts-obiee-nqquery-log/</link>
      <pubDate>Fri, 11 Jun 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/06/11/scripts-obiee-nqquery-log/</guid>
      <description>&lt;p&gt;Here are a couple of little unix scripts that I wrote whilst developing my &lt;a href=&#34;https://rmoff.net/2010/05/24/performance-testing-and-obiee/&#34;&gt;performance testing OBIEE method&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;p&gt;They&amp;rsquo;re nothing particularly special, but may save you the couple of minutes it&amp;rsquo;d take to write them :)&lt;/p&gt;&#xA;&lt;p&gt;Note that some of this data is available from Usage Tracking and where it is I&amp;rsquo;d recommend getting it from there, databases generally being easier to reliably and repeatably query than a transient log file.&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBIEE 11g launch date - 7th July 2010</title>
      <link>https://rmoff.net/2010/06/03/obiee-11g-launch-date-7th-july-2010/</link>
      <pubDate>Thu, 03 Jun 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/06/03/obiee-11g-launch-date-7th-july-2010/</guid>
      <description>&lt;p&gt;OBIEE 11g is going to be officially launched on 7th July (this year!) in London: &lt;a href=&#34;http://www.oracle.com/webapps/events/EventsDetail.jsp?p_eventId=113706&amp;amp;src=7036704&amp;amp;src=7036704&amp;amp;Act=9&#34;&gt;Launch Event: Introducing Oracle Business Intelligence Enterprise Edition 11g&lt;/a&gt;&lt;/p&gt;&#xA;&lt;p&gt;h/t to &lt;a href=&#34;http://www.google.co.uk/search?q=obiee+11g&amp;amp;num=30&amp;amp;hl=en&amp;amp;safe=off&amp;amp;client=firefox-a&amp;amp;hs=ohQ&amp;amp;rls=org.mozilla:en-US:official&amp;amp;prmd=v&amp;amp;source=lnms&amp;amp;tbs=mbl:1&amp;amp;ei=5WEHTP6JI5P00gS9rqVp&amp;amp;sa=X&amp;amp;oi=mode_link&amp;amp;ct=mode&amp;amp;ved=0CBUQ_AU&amp;amp;prmdo=1&#34;&gt;Twitter&lt;/a&gt; and &lt;a href=&#34;http://blogs.oracle.com/bi/2010/06/into_orbit_obiee_11g_launch.html&#34;&gt;blogs&lt;/a&gt; (including an Oracle-branded one) this morning. Good news travels fast! :-)&lt;/p&gt;&#xA;&lt;p&gt;Looks like John Minkjan won &lt;a href=&#34;http://obiee101.blogspot.com/p/obiee-11g-ga-bet.html&#34;&gt;the bet&lt;/a&gt;, did he know something we didn&amp;rsquo;t? ;-) (although is &amp;ldquo;Launch&amp;rdquo; the same as &amp;ldquo;GA&amp;rdquo;?)&lt;/p&gt;</description>
    </item>
    <item>
      <title>Performance Testing and OBIEE</title>
      <link>https://rmoff.net/2010/05/24/performance-testing-and-obiee/</link>
      <pubDate>Mon, 24 May 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/05/24/performance-testing-and-obiee/</guid>
      <description>&lt;p&gt;Here&amp;rsquo;s my presentation &amp;ldquo;Performance Testing and OBIEE&amp;rdquo; that I gave at the &lt;a href=&#34;https://rmoff.net/2010/05/21/rittmanmead-bi-forum-2010/&#34;&gt;RittmanMead BI Forum 2010&lt;/a&gt;: &lt;strong&gt;&lt;a href=&#34;https://talks.rmoff.net/performance-testing-and-obiee-sd/&#34;&gt;Performance Testing and OBIEE.pptx&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;&#xA;&lt;p&gt;It&amp;rsquo;s a Powerpoint 2007 file (pptx) for which you may need the &lt;a href=&#34;http://www.microsoft.com/downloads/details.aspx?familyid=048dc840-14e1-467d-8dca-19d2a8fd7485&amp;amp;displaylang=en&#34;&gt;Microsoft PowerPoint Viewer 2007&lt;/a&gt;. I&amp;rsquo;ve included copious notes on each slide which hopefully cover the gist of what I talked about when I was delivering it. There are also a handful of funky animations which is why I&amp;rsquo;ve left it in pptx and not exported to PDF or other format (sorry Open Office users).&lt;/p&gt;</description>
    </item>
    <item>
      <title>My first presentation - afterthoughts</title>
      <link>https://rmoff.net/2010/05/23/my-first-presentation-afterthoughts/</link>
      <pubDate>Sun, 23 May 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/05/23/my-first-presentation-afterthoughts/</guid>
      <description>&lt;p&gt;I delivered my first presentation today, at the &lt;a href=&#34;http://www.rittmanmead.com/biforum2010/&#34;&gt;RittmanMead BI Forum&lt;/a&gt;. I was really nervous in the hours and minutes leading up to it, but once I got up there and started talking I actually quite enjoyed it. If you were in the audience, I&amp;rsquo;d love some feedback in the comments section below, particularly any &amp;ldquo;constructive criticism&amp;rdquo;. I obviously didn&amp;rsquo;t make too much of a mess of it, as I was awarded &amp;ldquo;best speaker&amp;rdquo; of the event, which was a great honour. Hopefully I&amp;rsquo;ll get the opportunity to present again soon, perhaps at a UKOUG event.&lt;/p&gt;</description>
    </item>
    <item>
      <title>RittmanMead BI Forum 2010</title>
      <link>https://rmoff.net/2010/05/21/rittmanmead-bi-forum-2010/</link>
      <pubDate>Fri, 21 May 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/05/21/rittmanmead-bi-forum-2010/</guid>
      <description>&lt;p&gt;I&amp;rsquo;ve just returned from the &lt;a href=&#34;http://www.rittmanmead.com/biforum2010/&#34;&gt;RittmanMead 2010 BI Forum&lt;/a&gt; which this year was at the Seattle Hotel in Brighton Marina.&lt;/p&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://rmoff.net/images/2010/05/from-iphone-20100521-021.webp&#34;&gt;&lt;img src=&#34;https://rmoff.net/images/2010/05/from-iphone-20100521-021.webp&#34; alt=&#34;&#34; title=&#34;from iphone 20100521 021&#34;&gt;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;p&gt;The event was limited to 50 attendees, and I think this was a good number. The event was explicitly pitched at a very technical expert level, and the audience very much represented this. Whereas at a larger conference you may find the odd manager wandering around pretending to be technical ;-) this event was most definitely not pitched at such types.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Validating EBS-BI authentication, without BI</title>
      <link>https://rmoff.net/2010/05/17/validating-ebs-bi-authentication-without-bi/</link>
      <pubDate>Mon, 17 May 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/05/17/validating-ebs-bi-authentication-without-bi/</guid>
      <description>&lt;p&gt;Troubleshooting EBS-BI integrated authentication can be a tiresome activity, so here&amp;rsquo;s a shortcut that might help. If you suspect the problem lies with EBS then you can leave OBIEE out of the equation.&lt;/p&gt;&#xA;&lt;ol&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;Login to EBS&lt;br&gt;&#xA;&lt;img src=&#34;https://rmoff.net/images/rnm1978/2010-05-17_1155081.png&#34; alt=&#34;&#34; title=&#34;2010-05-17_115508&#34;&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;Use &lt;a href=&#34;https://addons.mozilla.org/en-US/firefox/addon/1843/&#34;&gt;FireBug&lt;/a&gt; or &lt;a href=&#34;http://www.fiddler2.com/fiddler2/&#34;&gt;Fiddler2&lt;/a&gt; to inspect web traffic as follows:&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;Click the BI link from EBS&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;The first request should be to the EBS server, which returns a 302 and redirects to &lt;code&gt;http://&amp;lt;bi server&amp;gt;:&amp;lt;port&amp;gt;/analytics/saw.dll?Dashboard&amp;amp;acf=101507310&lt;/code&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;Record the value of acf (eg &lt;code&gt;101507310&lt;/code&gt;)&lt;br&gt;&#xA;&lt;img src=&#34;https://rmoff.net/images/rnm1978/2010-05-17_1201361.png&#34; alt=&#34;&#34; title=&#34;2010-05-17_120136&#34;&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>What am I missing here??? ORA-01017: invalid username/password; logon denied</title>
      <link>https://rmoff.net/2010/05/06/what-am-i-missing-here-ora-01017-invalid-usernamepassword-logon-denied/</link>
      <pubDate>Thu, 06 May 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/05/06/what-am-i-missing-here-ora-01017-invalid-usernamepassword-logon-denied/</guid>
      <description>&lt;p&gt;What&amp;rsquo;s going on here? The username/password is definitely valid, proved by the sqlplus connection.&lt;/p&gt;&#xA;&lt;p&gt;Configuring DAC in OBIA 7.9.5.1:&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;What can I do for you?&#xD;&#xA;&#xD;&#xA;1 - Enter repository connection information&#xD;&#xA;2 - Test repository connection&#xD;&#xA;3 - Enter email account information&#xD;&#xA;4 - Send test email&#xD;&#xA;5 - Save changes&#xD;&#xA;6 - Exit&#xD;&#xA;&#xD;&#xA;Please make your selection: 1&#xD;&#xA;&#xD;&#xA;These are your connection type choices:&#xD;&#xA;&#xD;&#xA;1 - MSSQL&#xD;&#xA;2 - DB2&#xD;&#xA;3 - Oracle (OCI8)&#xD;&#xA;4 - Oracle (Thin)&#xD;&#xA;5 - Keep current ( Oracle (Thin) )&#xD;&#xA;&#xD;&#xA;Please make your selection: 4&#xD;&#xA;&#xD;&#xA;Current value for Instance is MYDB.&#xD;&#xA;Press return to keep it or enter a new value.&#xD;&#xA;&amp;gt; MYDB&#xD;&#xA;&#xD;&#xA;Current value for Database Host is server.company.com.&#xD;&#xA;Press return to keep it or enter a new value.&#xD;&#xA;&amp;gt; server.company.com&#xD;&#xA;&#xD;&#xA;Current value for Database Port is 1521.&#xD;&#xA;Press return to keep it or enter a new value.&#xD;&#xA;&amp;gt; 1521&#xD;&#xA;&#xD;&#xA;Current value for Table owner name is DAC_REPO_795.&#xD;&#xA;Press return to keep it or enter a new value.&#xD;&#xA;&amp;gt; DAC_REPO_795&#xD;&#xA;&#xD;&#xA;Press return to keep current password, enter a new value otherwise.&#xD;&#xA;&amp;gt; HAS425Al&#xD;&#xA;&#xD;&#xA;What can I do for you?&#xD;&#xA;&#xD;&#xA;1 - Enter repository connection information&#xD;&#xA;2 - Test repository connection&#xD;&#xA;3 - Enter email account information&#xD;&#xA;4 - Send test email&#xD;&#xA;5 - Save changes&#xD;&#xA;6 - Exit&#xD;&#xA;&#xD;&#xA;Please make your selection: 2&#xD;&#xA;&#xD;&#xA;Connecting to repository...&#xD;&#xA;Can&amp;#39;t connect to the database.&#xD;&#xA;ORA-01017: invalid username/password; logon 
denied&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Validate connectivity with SQLPLUS:&lt;/p&gt;</description>
    </item>
    <item>
      <title>RTFAL!</title>
      <link>https://rmoff.net/2010/04/24/rtfal/</link>
      <pubDate>Sat, 24 Apr 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/04/24/rtfal/</guid>
      <description>&lt;p&gt;This is a note-to-self really. When playing around with Oracle and something&amp;rsquo;s not working - RTFAL: Read The Flippin Alert Log!&lt;/p&gt;&#xA;&lt;p&gt;After resizing a VM I was getting this problem:&lt;/p&gt;&#xA;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;[oracle@RNMVM01 ~]$ sqlplus / as sysdba&#xD;&#xA;&#xD;&#xA;SQL*Plus: Release 11.2.0.1.0 Production on Sat Apr 24 17:44:44 2010&#xD;&#xA;&#xD;&#xA;Copyright (c) 1982, 2009, Oracle.  All rights reserved.&#xD;&#xA;&#xD;&#xA;Connected to an idle instance.&#xD;&#xA;&#xD;&#xA;SQL&amp;gt; startup nomount&#xD;&#xA;ORA-00845: MEMORY_TARGET not supported on this system&#xD;&#xA;SQL&amp;gt;&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;I spent longer than I should have reading around on google, hitting various pages which all talked around memory management and /dev/shm&lt;/p&gt;</description>
    </item>
    <item>
      <title>Opera &#43; Oracle EM = true love</title>
      <link>https://rmoff.net/2010/04/22/opera-oracle-em-true-love/</link>
      <pubDate>Thu, 22 Apr 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/04/22/opera-oracle-em-true-love/</guid>
      <description>&lt;p&gt;A faithful Firefox user for many years, I&amp;rsquo;ve been having a rather delightful rekindling of my old love for &lt;a href=&#34;http://www.opera.com/&#34;&gt;Opera&lt;/a&gt; recently. Back in the day, when I was a beardless bairn, I even paid for Opera, I liked it so much. Then &lt;a href=&#34;http://www.mozilla-europe.org/en/firefox/&#34;&gt;Firefox&lt;/a&gt; came along and Opera got dropped by the wayside like a teenage crush.&lt;/p&gt;&#xA;&lt;p&gt;Well Opera 10.5 was released recently and I&amp;rsquo;m liking it. I miss my Firefox extensions too much to switch entirely, but damn, Opera is FAST!!&lt;/p&gt;</description>
    </item>
    <item>
      <title>My first presentation - help!</title>
      <link>https://rmoff.net/2010/04/21/my-first-presentation-help/</link>
      <pubDate>Wed, 21 Apr 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/04/21/my-first-presentation-help/</guid>
      <description>&lt;p&gt;I&amp;rsquo;m doing my first ever conference presentation next month at the &lt;a href=&#34;http://www.rittmanmead.com/biforum2010/&#34;&gt;2010 Rittman Mead BI Forum&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;p&gt;My presentation is called &lt;strong&gt;Performance Testing OBIEE&lt;/strong&gt;, which is something I&amp;rsquo;ve spent a lot of time working on over the last few months. I think the challenge is going to be distilling it all into a session that&amp;rsquo;s not going to overwhelm everyone or bore them to death! Well, actually, the challenge is going to be the presenting. I can talk geek one-on-one, but talking to a whole bunch of people, not wittering but staying focussed, holding their attention&amp;hellip;.uh oh.&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBIEE 11g tidbit - XUDML support</title>
      <link>https://rmoff.net/2010/03/18/obiee-11g-tidbit-xudml-support/</link>
      <pubDate>Thu, 18 Mar 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/03/18/obiee-11g-tidbit-xudml-support/</guid>
      <description>&lt;p&gt;Spotted this when trawling through My Oracle Support. It&amp;rsquo;s pretty common knowledge anyway amongst people already familiar with hacking around with OBIEE, but worth recording for people coming along to it new.&lt;/p&gt;&#xA;&lt;p&gt;&lt;a href=&#34;https://supporthtml.oracle.com/ep/faces/secure/km/DocumentDisplay.jspx?id=1068266.1&#34;&gt;Doc ID 1068266.1&lt;/a&gt; states:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;UDML is not supported in OBI 10g.&lt;/p&gt;&#xA;&lt;p&gt;in 11g, XUDML (the Oracle BI Server XML API) will be fully supported and documented.&lt;/p&gt;&#xA;&lt;/blockquote&gt;</description>
    </item>
    <item>
      <title>ORA-13757: &#34;SQL Tuning Set&#34; &#34;string&#34; owned by user &#34;string&#34; is active.</title>
      <link>https://rmoff.net/2010/03/09/ora-13757-sql-tuning-set-string-owned-by-user-string-is-active/</link>
      <pubDate>Tue, 09 Mar 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/03/09/ora-13757-sql-tuning-set-string-owned-by-user-string-is-active/</guid>
      <description>&lt;p&gt;I&amp;rsquo;ve been playing around with &lt;a href=&#34;http://download.oracle.com/docs/cd/B28359_01/server.111/b28274/sql_tune.htm#i34915&#34;&gt;SQL Tuning Sets&lt;/a&gt;, and was trying to clear up my mess.&lt;/p&gt;&#xA;&lt;p&gt;To list all the tuning sets:&lt;/p&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-sql&#34; data-lang=&#34;sql&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#008000;font-weight:bold&#34;&gt;SET&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;WRAP&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color:#008000;font-weight:bold&#34;&gt;OFF&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#008000;font-weight:bold&#34;&gt;SET&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;LINE&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color:#666&#34;&gt;140&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;COL&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;NAME&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color:#008000;font-weight:bold&#34;&gt;FOR&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;A15&lt;span style=&#34;color:#bbb&#34;&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;COL&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;DESCRIPTION&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color:#008000;font-weight:bold&#34;&gt;FOR&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;A50&lt;span 
style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;WRAPPED&lt;span style=&#34;color:#bbb&#34;&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#bbb&#34;&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#008000;font-weight:bold&#34;&gt;select&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;name,created,last_modified,statement_count,description&lt;span style=&#34;color:#bbb&#34;&gt; &#xA;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#008000;font-weight:bold&#34;&gt;from&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;DBA_SQLSET&lt;span style=&#34;color:#bbb&#34;&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;NAME            CREATED   LAST_MODI STATEMENT_COUNT DESCRIPTION&#xD;&#xA;--------------- --------- --------- --------------- ----------------------------------------------------------------------------------------&#xD;&#xA;sts_test_02     09-MAR-10 09-MAR-10               1 Test run 1&#xD;&#xA;sts_test_01     12-FEB-10 12-FEB-10               1 an old STS test test test&#xA;&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;To delete a tuning set:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Securing OBIEE Systems Management JMX for remote access</title>
      <link>https://rmoff.net/2010/03/05/securing-obiee-systems-management-jmx-for-remote-access/</link>
      <pubDate>Fri, 05 Mar 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/03/05/securing-obiee-systems-management-jmx-for-remote-access/</guid>
      <description>&lt;h2 id=&#34;jmx&#34;&gt;JMX&lt;/h2&gt;&#xA;&lt;p&gt;OBIEE&amp;rsquo;s Systems Management functionality exposes performance counters and the application&amp;rsquo;s configuration options through Java MBeans and optionally a protocol called JMX.&lt;/p&gt;&#xA;&lt;p&gt;It&amp;rsquo;s extremely useful, and is documented pretty widely :&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://rmoff.net/2009/07/16/jconsole-jmx/&#34;&gt;JConsole / JMX&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://rmoff.net/2009/07/21/jconsole-jmx-followup/&#34;&gt;JConsole / JMX – followup&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://rmoff.net/2009/07/22/oracle-bi-management-systems-management-mbeans/&#34;&gt;Oracle BI Management / Systems Management MBeans&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://obiee101.blogspot.com/2009/07/obiee-perfmon-performance-monitor.html&#34;&gt;PerfMon&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://blogs.oracle.com/siebelessentials/2008/11/oracle_bi_ee_and_mbeans.html&#34;&gt;OBIEE MBeans and OC4J&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://rmoff.net/2009/07/29/oracle-bi-management-jmanage/&#34;&gt;OBIEE performance monitoring and alerting with jManage&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;p&gt;In this article I&amp;rsquo;m going to discuss the use of JMX to access these counters remotely, and a possible security issue that&amp;rsquo;s present in the &lt;a href=&#34;http://docs.google.com/viewer?a=v&amp;amp;q=cache:cBH-0QJHbTEJ:download.oracle.com/docs/cd/B16240_01/doc/apirefs.102/e12639.pdf+com.sun.management.jmxremote.authenticate%3Dfalse&amp;amp;hl=en&amp;amp;gl=uk&amp;amp;pid=bl&amp;amp;srcid=ADGEESiWEE9yb6LNERALgxwRhxGkUPC_8VzSZcYiyFUbV2MMMcP0RniO8EcSgT8Y2VsihL8JwLtTQHBuEDAQhS0FOOGfRKt9AxGdnbZEBalywMSEQoyzrktNU1ppcvLgB-F2Hjcr6gLI&amp;amp;sig=AHIEtbTc_xYSdrrFG4k-rsCaJrd4ZJjodQ&#34;&gt;BI Management Pack&lt;/a&gt; manual. 
The BI Management Pack is an add-on to Oracle&amp;rsquo;s Enterprise Manager / Grid Control for managing OBIEE, see &lt;a href=&#34;http://www.oracle.com/technology/pub/articles/rittman-oem-bipack.html&#34;&gt;Mark Rittman&amp;rsquo;s excellent guide on Oracle&amp;rsquo;s website&lt;/a&gt;.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Who&#39;s been at the cookie jar? EBS-BI authentication and Load Balancers</title>
      <link>https://rmoff.net/2010/03/05/whos-been-at-the-cookie-jar-ebs-bi-authentication-and-load-balancers/</link>
      <pubDate>Fri, 05 Mar 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/03/05/whos-been-at-the-cookie-jar-ebs-bi-authentication-and-load-balancers/</guid>
      <description>&lt;p&gt;We hit a very interesting problem in our Production environment recently. We&amp;rsquo;d made no changes for a long time to the configuration, but all of a sudden users were on the phone complaining. They could login to BI from EBS but after logging in the next link they clicked took them to the OBIEE &amp;ldquo;You are not logged in&amp;rdquo; screen.&lt;/p&gt;&#xA;&lt;p&gt;Our users login to EBS R12 and then using EBS authentication log in to OBIEE (10.1.3.4). Our OBIEE is deployed on OAS, load balanced across two servers by an F5 BIG-IP hardware load balancer.&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBIA 7.9.6 Performance Recommendations</title>
      <link>https://rmoff.net/2010/03/02/obia-7-9-6-performance-recommendations/</link>
      <pubDate>Tue, 02 Mar 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/03/02/obia-7-9-6-performance-recommendations/</guid>
      <description>&lt;p&gt;A new document has been published by Oracle, discussing ways of improving performance for OBIA 7.9.6 and 7.9.6.1. Its primary focus is around improving ETL performance. There&amp;rsquo;s some very interesting content including hardware sizing recommendations, and I&amp;rsquo;d strongly recommend anyone working with OBIA reads it.&lt;/p&gt;&#xA;&lt;p&gt;It&amp;rsquo;s called &amp;ldquo;&lt;strong&gt;Oracle Business Intelligence Applications Version 7.9.6 Performance Recommendations&lt;/strong&gt;&amp;rdquo; and is available on My Oracle Support through &lt;a href=&#34;https://supporthtml.oracle.com/ep/faces/secure/km/DocumentDisplay.jspx?id=870314.1&#34;&gt;Doc ID 870314.1&lt;/a&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>Oracle Support Blog back online, kinda.</title>
      <link>https://rmoff.net/2010/02/23/oracle-support-blog-back-online-kinda/</link>
      <pubDate>Tue, 23 Feb 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/02/23/oracle-support-blog-back-online-kinda/</guid>
      <description>&lt;p&gt;After it was &lt;a href=&#34;https://rmoff.net/2010/02/15/oracle-support-blog-no-more/&#34;&gt;ripped down last week&lt;/a&gt;, Chris Warticki&amp;rsquo;s blog is &lt;a href=&#34;http://blogs.oracle.com/Support/2010/02/support_blog_moved_to_my_oracl.html&#34;&gt;back online&lt;/a&gt;, albeit with the last posting redirecting visitors to a &lt;a href=&#34;https://communities.oracle.com/portal/server.pt/community/using_my_oracle_support/221&#34;&gt;new location on Oracle Communities&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;p&gt;Maybe I&amp;rsquo;m getting too old for this s##t, but I&amp;rsquo;m yet to really get a handle on how Oracle want to interact with real people on the ground. &lt;a href=&#34;https://communities.oracle.com/&#34;&gt;Oracle Communities&lt;/a&gt; is a fairly new site that I&amp;rsquo;ve not explored so much because there&amp;rsquo;s no OBIEE area. &lt;a href=&#34;http://support.oracle.com&#34;&gt;My Oracle Support&lt;/a&gt; has feedback and comment sections, but in general usability stinks. A support blog is torn down and then reappears. &lt;a href=&#34;http://forums.oracle.com&#34;&gt;Oracle Forums&lt;/a&gt; are popular as ever but the software blows and some forums (eg OBIEE) are withering under a very high noise to quality-content ratio. Oh, and &lt;a href=&#34;https://mix.oracle.com&#34;&gt;Oracle Mix&lt;/a&gt; (&amp;ldquo;Bringing Oracle Customers, Employees, and Developers together&amp;rdquo;) too, which looked quite neat but I have no idea how that&amp;rsquo;s supposed to fit into the picture.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Oracle Support blog no more</title>
      <link>https://rmoff.net/2010/02/15/oracle-support-blog-no-more/</link>
      <pubDate>Mon, 15 Feb 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/02/15/oracle-support-blog-no-more/</guid>
      <description>&lt;p&gt;A sad little passing last week, of the &lt;a href=&#34;http://blogs.oracle.com/Support/&#34;&gt;Oracle Support Blog&lt;/a&gt; and &lt;a href=&#34;http://twitter.com/cwarticki&#34;&gt;related tweets&lt;/a&gt; by Chris Warticki.&lt;/p&gt;&#xA;&lt;p&gt;Last week Chris posted this comment on twitter&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;&lt;em&gt;&lt;a href=&#34;http://www.twibes.com/group/customer-experience/tweet/9011049708&#34;&gt;&amp;ldquo;So, what to do if you&amp;rsquo;re the &amp;ldquo;online customer presence&amp;rdquo; and your own leadership wants to censor your posts and comments?&amp;rdquo;&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;p&gt;followed by this terse blog posting:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;&lt;strong&gt;Support Blog: No longer available&lt;/strong&gt;&lt;/p&gt;&#xA;&lt;p&gt;By chris.warticki on February 12, 2010 4:08 PM Please use My Oracle Support Communities instead&lt;/p&gt;</description>
    </item>
    <item>
      <title>Exadata V2 POC numbers</title>
      <link>https://rmoff.net/2010/02/10/exadata-v2-poc-numbers/</link>
      <pubDate>Wed, 10 Feb 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/02/10/exadata-v2-poc-numbers/</guid>
      <description>&lt;p&gt;I stumbled across this blog posting recently, and am reposting the link here as I&amp;rsquo;ve seen no mention of it elsewhere. This surprised me as with Exadata (and most new technologies) any snippet of news or tech insight gets blogged and tweeted multiple times over.&lt;/p&gt;&#xA;&lt;p&gt;Any insight into Exadata is interesting as unlike all the other Oracle software which can be downloaded and poked &amp;amp; prodded, this is a physical box so we rely on blog postings and marketing (with a BS filter) to understand more about it.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Illustrating data</title>
      <link>https://rmoff.net/2010/02/08/illlustrating-data/</link>
      <pubDate>Mon, 08 Feb 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/02/08/illlustrating-data/</guid>
      <description>&lt;p&gt;In the following list, which two &lt;a href=&#34;http://lifehacker.com/5188833/hive-five-five-best-mind-mapping-applications&#34;&gt;mind-mapping programs&lt;/a&gt; are rated best?&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/rnm1978/2010-02-08_114400.png&#34; alt=&#34;&#34; title=&#34;2010-02-08_114400&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;Now look at the actual numbers, and answer again &lt;img src=&#34;https://rmoff.net/images/rnm1978/2010-02-08_114400.png&#34; alt=&#34;&#34; title=&#34;2010-02-08_114400&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;Different answer?&lt;/p&gt;&#xA;&lt;p&gt;I can&amp;rsquo;t be the only one in this frantic world whose eyes are drawn to the pictures instead of words and leap to conclusions. It&amp;rsquo;s only because I use &lt;a href=&#34;http://freemind.sourceforge.net/wiki/index.php/Main_Page&#34;&gt;FreeMind&lt;/a&gt; and was surprised it scored so low &amp;hellip;. and then realised it hadn&amp;rsquo;t. Looks like the HTML rendering isn&amp;rsquo;t the same here (FF3.6) as when the web page author wrote it.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Brilliant performance articles by Cary Millsap</title>
      <link>https://rmoff.net/2010/01/29/brilliant-performance-articles-by-cary-millsap/</link>
      <pubDate>Fri, 29 Jan 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/01/29/brilliant-performance-articles-by-cary-millsap/</guid>
      <description>&lt;p&gt;There is a LOT written about performance. And this post is now adding to it. Some of it&amp;rsquo;s excellent, some of it less so. But a lot of it starts from a point so far down the process that unless you know the first bit, you&amp;rsquo;re going to go haring off and end up chasing your tail or p###ing in the wind&amp;hellip;. (pardon my French). Without a well structured approach that you understand and always follow you&amp;rsquo;ll hit on solutions by luck only.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Identify your OBIEE users by setting Client ID in Oracle connection</title>
      <link>https://rmoff.net/2010/01/26/identify-your-users-by-setting-client-id-in-oracle/</link>
      <pubDate>Tue, 26 Jan 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/01/26/identify-your-users-by-setting-client-id-in-oracle/</guid>
      <description>&lt;p&gt;You get a call from your friendly DBA. He says the production database is up the spout, and it&amp;rsquo;s &amp;ldquo;that bee eye thingumy causing it&amp;rdquo;. What do you do now? All you&amp;rsquo;ve got to go on is a program name in the Oracle session tables of &amp;ldquo;nqsserver@MYSERVER (TNS V1-V3)&amp;rdquo; and the SQL the DBA sent you that if you&amp;rsquo;re lucky will look as presentable as this: &lt;img src=&#34;https://rmoff.net/images/rnm1978/2010-01-25_145658.png&#34; alt=&#34;&#34; title=&#34;2010-01-25_145658&#34;&gt; The username against the SQL is the generic User ID that you had created for connections to the database from OBIEE.&lt;/p&gt;</description>
    </item>
    <item>
      <title>How to resolve &#34;[nQSError: 12002] Socket communication error at call=: (Number=-1) Unknown&#34;</title>
      <link>https://rmoff.net/2010/01/22/how-to-resolve-nqserror-12002-socket-communication-error-at-call-number-1-unknown/</link>
      <pubDate>Fri, 22 Jan 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/01/22/how-to-resolve-nqserror-12002-socket-communication-error-at-call-number-1-unknown/</guid>
      <description>&lt;p&gt;This error caught me out today. I was building a Linux VM to do some work on, and for the life of me couldn&amp;rsquo;t get the OBIEE Admin Tool to connect to the BI Server on the VM.&lt;/p&gt;&#xA;&lt;p&gt;The error I got when trying to define a DSN on the Windows box was:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;[nQSError: 12008] Unable to connect to port 9703 on machine 10.3.105.132 [nQSError: 12010] Communication error connecting to remote end point: address = 10.3.105.132; port = 9703. [nQSError: 12002] Socket communication error at call=: (Number=-1) Unknown&lt;/p&gt;</description>
    </item>
    <item>
      <title>Hardening OAS</title>
      <link>https://rmoff.net/2010/01/21/hardening-oas/</link>
      <pubDate>Thu, 21 Jan 2010 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2010/01/21/hardening-oas/</guid>
      <description>&lt;p&gt;&lt;a href=&#34;https://rmoff.net/categories/oas/&#34;&gt;Oracle Application Server&lt;/a&gt; (OAS) is the Web and Application server typically deployed with OBIEE. There are several settings which by default may be viewed as security weaknesses. Whether realistically a target or not, it&amp;rsquo;s good practice to always be considering security and lock down your servers as much as reasonably possible. I adopt the default stance of having to find a reason to leave something less secure, rather than justify why it needs doing.&lt;/p&gt;</description>
    </item>
    <item>
      <title>libnnz10.so: cannot restore segment prot after reloc: Permission denied</title>
      <link>https://rmoff.net/2009/12/18/libnnz10-so-cannot-restore-segment-prot-after-reloc-permission-denied/</link>
      <pubDate>Fri, 18 Dec 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/12/18/libnnz10-so-cannot-restore-segment-prot-after-reloc-permission-denied/</guid>
      <description>&lt;p&gt;Quick post as the snow&amp;rsquo;s coming down and I wanna go home &amp;hellip;&lt;/p&gt;&#xA;&lt;p&gt;I&amp;rsquo;ve been working on building a VM based on OEL5.4 and OBIEE 10.1.3.4.1. After installing XE 10.2 I tried to fire my RPD up, but hit this:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;/usr/lib/oracle/xe/app/oracle/product/10.2.0/server/lib/libnnz10.so: cannot restore segment prot after reloc: Permission denied [nQSError: 46029] Failed to load the DLL /app/oracle/product/obiee/server/Bin/libnqsdbgatewayoci10g.so. Check if &amp;lsquo;Oracle OCI 10G&amp;rsquo; database client is installed.&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;p&gt;If you trace the &amp;lsquo;stack&amp;rsquo; back you find that it parses down to this nub of an error:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Running the OBIEE admin tool on Unix</title>
      <link>https://rmoff.net/2009/12/14/running-the-obiee-admin-tool-on-unix/</link>
      <pubDate>Mon, 14 Dec 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/12/14/running-the-obiee-admin-tool-on-unix/</guid>
      <description>&lt;p&gt;Mucho kudos to Borkur Steingrimsson for getting &lt;a href=&#34;http://www.rittmanmead.com/2009/12/13/running-the-obiee-administratiol-tool-on-unix-using-wine/&#34;&gt;the OBIEE admin tool working on Unix&lt;/a&gt;!&lt;/p&gt;</description>
    </item>
    <item>
      <title>CAF</title>
      <link>https://rmoff.net/2009/12/11/caf/</link>
      <pubDate>Fri, 11 Dec 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/12/11/caf/</guid>
      <description>&lt;p&gt;Very interesting post by Kevin McGinley about CAF here:  &lt;a href=&#34;http://oraclebiblog.blogspot.com/2009/12/11/caf/&#34;&gt;CAF = Migration Utility? Use Caution!&lt;/a&gt;&lt;/p&gt;&#xA;&lt;p&gt;It articulates better than I ever could reasons against using CAF particularly in a production environment.&lt;/p&gt;&#xA;&lt;p&gt;Since the tool came out I&amp;rsquo;d been struggling to get my head around it, convinced I was missing something. I still don&amp;rsquo;t profess to understand it properly, but Kevin&amp;rsquo;s article reassures me that I shouldn&amp;rsquo;t be losing too much sleep over it, especially given that it&amp;rsquo;s unsupported and &lt;a href=&#34;http://www.oracle.com/technology/obe/obe_bi/bi_ee_1013/caf/caf.html#t1&#34;&gt;won&amp;rsquo;t work with 11g&lt;/a&gt;.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Troubleshooting Presentation Services / analytics connectivity</title>
      <link>https://rmoff.net/2009/12/09/troubleshooting-presentation-services-analytics-connectivity/</link>
      <pubDate>Wed, 09 Dec 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/12/09/troubleshooting-presentation-services-analytics-connectivity/</guid>
      <description>&lt;p&gt;Short but sweet this one - a way of troubleshooting connectivity problems between &lt;em&gt;analytics&lt;/em&gt; (the Presentation Services Plug-in, either j2ee servlet or ISAPI, a.k.a. SAWBridge) and &lt;em&gt;sawserver&lt;/em&gt; (Presentation Services).&lt;/p&gt;&#xA;&lt;p&gt;For a recap on the services &amp;amp; flow please see the first few paragraphs of &lt;a href=&#34;https://rmoff.net/2009/11/06/obiee-clustering-specifying-multiple-presentation-services-from-presentation-services-plug-in/&#34;&gt;this post&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;p&gt;Problems in connectivity between analytics and sawserver normally manifest themselves through this error message:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;500 Internal Server Error Servlet error: An exception occurred. The current application deployment descriptors do not allow for including it in this response. Please consult the application log for details.&lt;/p&gt;</description>
    </item>
    <item>
      <title>UKOUG TEBS 2009</title>
      <link>https://rmoff.net/2009/12/03/ukoug-tebs-2009/</link>
      <pubDate>Thu, 03 Dec 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/12/03/ukoug-tebs-2009/</guid>
      <description>&lt;p&gt;This was my first UKOUG TEBS, in fact the first conference I&amp;rsquo;d ever attended! I was quite unsure what to expect, but three days later I can safely say it was &lt;strong&gt;invaluable&lt;/strong&gt;.&lt;/p&gt;&#xA;&lt;p&gt;The variety of presentations and expertise being shared was impressive, and it was great to hear people sharing and discussing their ideas and opinions around the subjects I work with each day.&lt;/p&gt;&#xA;&lt;p&gt;Working in isolation is not a good idea; one can develop a blinkered or bunker mentality. Blogging is good for breaking out of this, as is, &lt;a href=&#34;http://www.oraclenerd.com/2009/10/how-to-use-social-media-to-increase.html&#34;&gt;I&amp;rsquo;m now convinced, twittering&lt;/a&gt;. However, nothing beats meeting up with folk and discussing problems face to face, listening and learning from their experience.&lt;/p&gt;</description>
    </item>
    <item>
      <title>I think this summarises everything.</title>
      <link>https://rmoff.net/2009/11/27/i-think-this-summarises-everything/</link>
      <pubDate>Fri, 27 Nov 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/11/27/i-think-this-summarises-everything/</guid>
      <description>&lt;p&gt;Why did this make me think of the OBIA upgrade documentation?? ;-)&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/rnm1978/tumblr_kt2eynYSP91qz4axuo1_500.jpg&#34; alt=&#34;&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;via &lt;a href=&#34;http://mnmal.tumblr.com/post/249575328&#34;&gt;I think this summarizes everything.&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;p&gt;[Update December 18, 2013: The lady in the picture is &lt;a href=&#34;https://twitter.com/seriouspony/status/413346604087640065&#34;&gt;Kathy Sierra&lt;/a&gt;]&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBIEE application servers, now and future</title>
      <link>https://rmoff.net/2009/11/25/obiee-application-servers-now-and-future/</link>
      <pubDate>Wed, 25 Nov 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/11/25/obiee-application-servers-now-and-future/</guid>
      <description>&lt;p&gt;Oracle have published an interesting doc &lt;a href=&#34;https://supporthtml.oracle.com/ep/faces/secure/km/DocumentDisplay.jspx?id=968223.1&#34;&gt;968223.1&lt;/a&gt;, entitled &amp;ldquo;Enterprise Deployment of Oracle BI EE on OC4J and App Servers&amp;rdquo;.&lt;/p&gt;&#xA;&lt;p&gt;It details the differences between OC4J and OAS, which is useful for the current versions of OBIEE. It then also gives a useful heads-up &amp;ndash; that WebLogic becomes the App server of choice in the next version of OBIEE:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;&lt;em&gt;All of this changes in OBI EE 11g where several projects will become absolutely dependent upon an App Server. Within Oracle Fusion Middleware, WebLogic Server will be the application server and &lt;strong&gt;OBI EE 11g will be deployed and certified with WebLogic Server&lt;/strong&gt;.&lt;/em&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>SoOotW and sweep</title>
      <link>https://rmoff.net/2009/11/25/soootw-and-sweep/</link>
      <pubDate>Wed, 25 Nov 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/11/25/soootw-and-sweep/</guid>
      <description>&lt;p&gt;(Sorry for the lame title, but it gives me an excuse to put this picture in :) &lt;img src=&#34;https://rmoff.net/images/rnm1978/sooty_sweep_150_150x180.jpg&#34; alt=&#34;&#34;&gt;)&lt;/p&gt;&#xA;&lt;p&gt;I was really pleased by the response I got from my posting about &lt;a href=&#34;https://rmoff.net/2009/10/30/the-state-of-obiee-on-the-web/&#34;&gt;The state of OBIEE on the web&lt;/a&gt;, knowing that it&amp;rsquo;s not just me goes a long way to keeping my blood pressure down.&lt;/p&gt;&#xA;&lt;p&gt;The OTN forum is still an unbalanced mix of tosh (&lt;a href=&#34;http://forums.oracle.com/forums/thread.jspa?threadID=992547&amp;amp;tstart=0&#34;&gt;is this guy for real?&lt;/a&gt;) with the occasional insightful and interesting post or idea such as &lt;a href=&#34;http://forums.oracle.com/forums/thread.jspa?threadID=991528&amp;amp;tstart=0&#34;&gt;this really neat one from Joe Bertram&lt;/a&gt; about using multiple presentation services on top of the same RPD to give different interfaces to different end devices.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Resolved: sawserver : Error loading security privilege /system/privs/catalog/ChangePermissionsPrivilege</title>
      <link>https://rmoff.net/2009/11/17/resolved-sawserver-error-loading-security-privilege-systemprivscatalogchangepermissionsprivilege/</link>
      <pubDate>Tue, 17 Nov 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/11/17/resolved-sawserver-error-loading-security-privilege-systemprivscatalogchangepermissionsprivilege/</guid>
      <description>&lt;p&gt;Whilst installing OBIA 7.9.6.1 I hit this problem when firing up Presentation Services (sawserver):&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;Error loading security privilege /system/privs/catalog/ChangePermissionsPrivilege.&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;p&gt;A quick search on the forums threw up &lt;a href=&#34;http://forums.oracle.com/forums/thread.jspa?threadID=632090&#34;&gt;two&lt;/a&gt; &lt;a href=&#34;http://forums.oracle.com/forums/thread.jspa?threadID=938275&amp;amp;tstart=0&#34;&gt;posts&lt;/a&gt; suggesting a corrupted WebCat.&lt;/p&gt;&#xA;&lt;p&gt;Since I&amp;rsquo;d got this webcat fresh out of the box I was puzzled how it could be corrupted.&lt;/p&gt;&#xA;&lt;p&gt;I did a bit more tinkering (including &lt;a href=&#34;https://rmoff.net/categories/log/&#34;&gt;nosying around in the sawserver log&lt;/a&gt;), before realising it was indeed corrupt, and that I was indeed a muppet.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Deploying Oracle Business Intelligence Enterprise Edition on Sun Systems</title>
      <link>https://rmoff.net/2009/11/12/deploying-oracle-business-intelligence-enterprise-edition-on-sun-systems/</link>
      <pubDate>Thu, 12 Nov 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/11/12/deploying-oracle-business-intelligence-enterprise-edition-on-sun-systems/</guid>
      <description>&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/rnm1978/821-0698.gif&#34; alt=&#34;&#34;&gt; A very interesting new PDF from Sun on deploying OBIEE has been published, with discussions on architecture, performance and best practice.&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;This Sun BluePrints article describes an enterprise deployment architecture for Oracle Business Intelligence Enterprise Edition using Sun servers running the Solaris Operating System and Sun Storage 7000 Unified Storage systems. Designed to empower employees in organizations in any industry—from customer service, shipping, and finance to manufacturing, human resources, and more—to become potential decision makers, the architecture brings fault tolerance, security, resiliency, and performance to enterprise deployments. Taking advantage of key virtualization technologies, the architecture can be used to consolidate multiple tiers onto a single system to help reduce cost and complexity. A short discussion of the performance characteristics of the architecture using a realistic workload also is included.&lt;/p&gt;</description>
    </item>
    <item>
      <title>#Fail: My Oracle Support</title>
      <link>https://rmoff.net/2009/11/11/fail-my-oracle-support/</link>
      <pubDate>Wed, 11 Nov 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/11/11/fail-my-oracle-support/</guid>
      <description>&lt;p&gt;Metalink was retired this weekend and made way for the new My Oracle Support system. It didn&amp;rsquo;t go as smoothly as it could have done.&lt;/p&gt;&#xA;&lt;p&gt;This post is going to be a bit of a rambling rant.&lt;/p&gt;&#xA;&lt;p&gt;Ultimately people, including me, don&amp;rsquo;t like their &lt;a href=&#34;http://en.wikipedia.org/wiki/Who_Moved_My_Cheese%3F&#34;&gt;cheese being moved&lt;/a&gt; (not unless there&amp;rsquo;s a really runny piece of &lt;a href=&#34;http://en.wikipedia.org/wiki/Cheese_Shop_sketch#Table_of_Cheeses&#34;&gt;Camembert&lt;/a&gt; at the end of it). That makes it a bit more difficult to discuss because some people&amp;rsquo;s complaints will just be geeks being stubborn (and boy, can geeks be stubborn). Arguments descend into minutiae of detail and Flash vs DHTML - whilst the bigger picture gets lost.&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBIEE clustering - specifying multiple Presentation Services from Presentation Services Plug-in</title>
      <link>https://rmoff.net/2009/11/06/clustering-specifying-multiple-presentation-services-from-presentation-services-plug-in/</link>
      <pubDate>Fri, 06 Nov 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/11/06/clustering-specifying-multiple-presentation-services-from-presentation-services-plug-in/</guid>
      <description>&lt;h1 id=&#34;introduction&#34;&gt;Introduction&lt;/h1&gt;&#xA;&lt;p&gt;Whilst the BI Cluster Controller takes care nicely of clustering and failover for BI Server (nqsserver), we have to do more to ensure further resilience of the stack.&lt;/p&gt;&#xA;&lt;p&gt;A diagram I come back to again and again when working out configuration or connectivity problems is the one on P16 of the &lt;a href=&#34;http://download.oracle.com/docs/cd/E10415_01/doc/bi.1013/b40058.pdf&#34;&gt;Deployment Guide&lt;/a&gt;. With this you can work out most issues for yourself through simple reasoning. Print it out, pin it to your wall, and read it!&lt;/p&gt;</description>
    </item>
    <item>
      <title>Advanced Googling for OBIEE information</title>
      <link>https://rmoff.net/2009/11/03/advanced-googling-for-obiee-information/</link>
      <pubDate>Tue, 03 Nov 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/11/03/advanced-googling-for-obiee-information/</guid>
      <description>&lt;p&gt;Want to find all PDFs from Oracle about OBIEE?&lt;/p&gt;&#xA;&lt;p&gt;&lt;a href=&#34;http://www.google.com/search?q=site%3A.oracle.com+filetype%3Apdf+obiee&#34;&gt;&lt;img src=&#34;https://rmoff.net/images/rnm1978/site-oracle-com-filetype-pdf-obiee-google-search_1257257786312.png&#34; alt=&#34;site-.oracle.com filetype-pdf obiee - Google Search_1257257786312&#34;&gt;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;p&gt;There&amp;rsquo;s some interesting ones that this turned up on Oracle&amp;rsquo;s public FTP. In particular:&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&lt;a href=&#34;ftp://ftp.oracle.com/sales/outgoing/bwise/config/BI%20EE%20SCM,%20Merge%20and%20Migration.pdf&#34; title=&#34;BI EE Environmental and Multi-User Development Migration Processes&#34;&gt;BI EE Environmental and Multi-User Development Migration Processes&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;ftp://ftp.oracle.com/sales/outgoing/bwise/security/Securing%20Oracle%20BI%20distro.pdf&#34;&gt;Securing Oracle BI Enterprise Edition&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;p&gt;(There&amp;rsquo;s also PowerPoint files on there including &lt;a href=&#34;ftp://ftp.oracle.com/sales/outgoing/bwise/pnnl/BIEE%20Environmental%20and%20Multi-User%20Development%20Migration.ppt&#34;&gt;this one&lt;/a&gt; but Google doesn&amp;rsquo;t seem to index them)&lt;/p&gt;&#xA;&lt;p&gt;You can use &lt;a href=&#34;http://www.google.co.uk/advanced_search&#34;&gt;Google&amp;rsquo;s Advanced Search page&lt;/a&gt; to build similar queries:&lt;/p&gt;</description>
    </item>
    <item>
      <title>CAF installation video</title>
      <link>https://rmoff.net/2009/11/03/caf-installation-video/</link>
      <pubDate>Tue, 03 Nov 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/11/03/caf-installation-video/</guid>
      <description>&lt;p&gt;Christian Screen has done a nice video explaining the CAF installation, and has promised a deep-dive followup which I&amp;rsquo;m looking forward to. &lt;a href=&#34;http://www.artofbi.com/index.php/2009/11/obiee-content-accelerator-framework-caf-installation&#34;&gt;Click here for the article&lt;/a&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>Experts</title>
      <link>https://rmoff.net/2009/11/02/experts/</link>
      <pubDate>Mon, 02 Nov 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/11/02/experts/</guid>
      <description>&lt;p&gt;A &lt;a href=&#34;http://jonathanlewis.wordpress.com/2009/10/18/experts/&#34;&gt;brilliant posting here from Jonathan Lewis&lt;/a&gt; on the subject of Experts. He in turn is quoting Chen Shapira: &amp;ldquo;DBAs are under a lot of pressure not to be experts.&amp;rdquo; Read the sentence again, as it took me a minute to figure out. He&amp;rsquo;s writing in the context of an Oracle DBA but I think it&amp;rsquo;s equally applicable to those working with and looking after installations of OBIEE et al.&lt;/p&gt;</description>
    </item>
    <item>
      <title>The state of OBIEE on the web</title>
      <link>https://rmoff.net/2009/10/30/the-state-of-obiee-on-the-web/</link>
      <pubDate>Fri, 30 Oct 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/10/30/the-state-of-obiee-on-the-web/</guid>
      <description>&lt;h2 id=&#34;an-advance-footnote&#34;&gt;An advance footnote&lt;/h2&gt;&#xA;&lt;p&gt;I&amp;rsquo;ll start this by saying why I think things are how they are, and then I&amp;rsquo;ll get to the meat of my article.&lt;/p&gt;&#xA;&lt;p&gt;OBIEE in its current incarnation (v10.1.3) is a mature product. All the big bugs have been caught and fixed. All the known quirks are well documented. All the missing features are known. All the clever workarounds have been found. All the neat little hacks have been explored.&lt;/p&gt;</description>
    </item>
    <item>
      <title>The mind boggles...</title>
      <link>https://rmoff.net/2009/10/26/the-mind-boggles/</link>
      <pubDate>Mon, 26 Oct 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/10/26/the-mind-boggles/</guid>
      <description>&lt;p&gt;I honestly don&amp;rsquo;t dare click the &amp;ldquo;Did you mean&amp;rdquo; link ;-)&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/rnm1978/obiee-mude-google-search_1256571930733.png&#34; alt=&#34;obiee MUDE - Google Search_1256571930733&#34; title=&#34;obiee MUDE - Google Search_1256571930733&#34;&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>Troubleshooting OBIEE and ORA-12154: TNS:could not resolve the connect identifier</title>
      <link>https://rmoff.net/2009/10/22/troubleshooting-obiee-and-ora-12154-tnscould-not-resolve-the-connect-identifier/</link>
      <pubDate>Thu, 22 Oct 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/10/22/troubleshooting-obiee-and-ora-12154-tnscould-not-resolve-the-connect-identifier/</guid>
      <description>&lt;p&gt;A frequent question on the &lt;a href=&#34;http://forums.oracle.com/forums/forum.jspa?forumID=378&#34;&gt;OTN OBIEE forum&lt;/a&gt; is how to fix this error:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;[nQSError: 17001] Oracle Error code: 12154, message: ORA-12154: TNS:could not resolve the connect identifier specified at OCI call OCIServerAttach.&#xA;[nQSError: 17014] Could not connect to Oracle database.&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;p&gt;The error is simply OBIEE reporting that it tried to connect from the BI Server to an Oracle database and the Oracle client returned an error. Distilling it down gives us this error:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Critical Patch Update - OBIEE vuln CVE-2009-1990</title>
      <link>https://rmoff.net/2009/10/21/obiee-vuln-cv-2009-1990/</link>
      <pubDate>Wed, 21 Oct 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/10/21/obiee-vuln-cv-2009-1990/</guid>
      <description>&lt;p&gt;October&amp;rsquo;s &lt;a href=&#34;http://www.oracle.com/technology/deploy/security/critical-patch-updates/cpuoct2009.html&#34;&gt;Oracle Critical Patch Update Advisory&lt;/a&gt; has been released. There are two vulnerabilities (CVE-2009-1999, CVE-2009-1990) listed under &lt;strong&gt;Oracle Application Server&lt;/strong&gt; for &amp;ldquo;Component&amp;rdquo; &lt;strong&gt;Business Intelligence Enterprise Edition&lt;/strong&gt; and one (CVE-2009-3407) for &amp;ldquo;component&amp;rdquo; &lt;strong&gt;Portal&lt;/strong&gt;.&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;CVE-2009-1999 is OBIEE and &amp;ldquo;Fixed in all supported versions. No patch provided in this Critical Patch Update.&amp;rdquo;.&lt;/li&gt;&#xA;&lt;li&gt;CVE-2009-3407 looks like only OAS (not OBIEE), up to versions 10.1.2.3 and 10.1.4.2.&lt;/li&gt;&#xA;&lt;li&gt;CVE-2009-1990 is OBIEE and is the main vuln of interest. It&amp;rsquo;s unclear if it&amp;rsquo;s just OBIEE 10.1.3.4.x, or all versions of OBIEE through to and including 10.1.3.4.1. It&amp;rsquo;s also confusing putting it on the same table as OAS especially given it has similar versioning (10.1.3.x.x).&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;p&gt;For information about patches, see &lt;a href=&#34;http://metalink.oracle.com/metalink/plsql/ml2_documents.showDocument?p_database_id=NOT&amp;amp;p_id=881382.1#AS_on_request&#34;&gt;My Oracle Support Note 881382.1&lt;/a&gt;. This doc lists patches &lt;a href=&#34;http://updates.oracle.com/ARULink/PatchDetails/process_form?patch_num=8927890&#34;&gt;8927890&lt;/a&gt; and &lt;a href=&#34;http://updates.oracle.com/ARULink/PatchDetails/process_form?patch_num=8927886&#34;&gt;8927886&lt;/a&gt; for OBIEE 10.1.3.4.1 and 10.1.3.4.0 respectively. 
Since no other versions are mentioned that suggests it doesn&amp;rsquo;t affect them but that&amp;rsquo;d be a heck of an assumption to make and if I were running &amp;lt; 10.1.3.4.0 I&amp;rsquo;d be raising an SR to seek clarification especially given the ambiguity of the table in the &lt;a href=&#34;http://www.oracle.com/technology/deploy/security/critical-patch-updates/cpuoct2009.html#AppendixOAS&#34;&gt;Advisory doc&lt;/a&gt;.&lt;/p&gt;</description>
    </item>
    <item>
      <title>OTN forum - &#34;Pro&#34;</title>
      <link>https://rmoff.net/2009/10/21/otn-forum-pro/</link>
      <pubDate>Wed, 21 Oct 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/10/21/otn-forum-pro/</guid>
      <description>&lt;p&gt;w00t :-D&lt;/p&gt;&#xA;&lt;p&gt;Yesterday I got my &amp;ldquo;Pro&amp;rdquo; medal for 500 points on &lt;a href=&#34;http://forums.oracle.com/forums&#34;&gt;OTN Forums&lt;/a&gt;&lt;/p&gt;&#xA;&lt;p&gt;I&amp;rsquo;d been intending to post a grumpy rant about OTN recently, but maybe I&amp;rsquo;ll postpone that for a few days now ;)&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/rnm1978/otn.png&#34; alt=&#34;OTN&#34; title=&#34;OTN&#34;&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>New OBIEE blogs</title>
      <link>https://rmoff.net/2009/10/19/new-obiee-blogs/</link>
      <pubDate>Mon, 19 Oct 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/10/19/new-obiee-blogs/</guid>
      <description>&lt;p&gt;Two new blogs of note:&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://shivbharti.blogspot.com/&#34;&gt;Shiv Bharti&amp;rsquo;s Blog&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://debaatobiee.wordpress.com/&#34;&gt;Debashis&amp;rsquo;s OBIEE Blog&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;p&gt;Both are well worth a read &amp;amp; following.&lt;/p&gt;</description>
    </item>
    <item>
      <title>BI Server hung - nQSError 14054 / 15001 / 23005</title>
      <link>https://rmoff.net/2009/10/16/bi-server-hung-nqserror-14054-15001-23005/</link>
      <pubDate>Fri, 16 Oct 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/10/16/bi-server-hung-nqserror-14054-15001-23005/</guid>
      <description>&lt;p&gt;Watch out if you are using init blocks in your RPD. We hit a bug (#9019374) recently that caused BI Server (10.1.3.4) to hang.&lt;/p&gt;&#xA;&lt;p&gt;The init block in question should have returned a date to update a repository variable, but because of badly-written SQL and abnormal data in the source table actually returned a &lt;strong&gt;null value&lt;/strong&gt;. BI Server evidently didn&amp;rsquo;t like this null being inserted somewhere where it shouldn&amp;rsquo;t have and understandably logged :&lt;/p&gt;</description>
    </item>
    <item>
      <title>Heads up - Critical Patch Update affecting OBIEE</title>
      <link>https://rmoff.net/2009/10/16/heads-up-critical-patch-update-for-oas-affecting-obiee/</link>
      <pubDate>Fri, 16 Oct 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/10/16/heads-up-critical-patch-update-for-oas-affecting-obiee/</guid>
      <description>&lt;p&gt;The &lt;a href=&#34;http://www.oracle.com/technology/deploy/security/critical-patch-updates/cpuoct2009.html&#34;&gt;Critical Patch Update Pre-Release Announcement&lt;/a&gt; for October has been published. The pre-release is advance notice of the affected software prior to release of the quarterly Critical Patch Update. It is published on the Thursday prior to the patch releases (which was postponed by a week because of OOW).&lt;/p&gt;&#xA;&lt;p&gt;It looks like if you&amp;rsquo;re running OBIEE 10.1.3.4.0 or 10.1.3.4.1 through OAS 10.1.2.3.0/10.1.3.4.0/10.1.3.5.0 then you should check back next Tuesday 20th for details.&lt;/p&gt;</description>
    </item>
    <item>
      <title>New OBIEE benchmark - 50,000 users</title>
      <link>https://rmoff.net/2009/10/12/new-obiee-benchmark-50000-users/</link>
      <pubDate>Mon, 12 Oct 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/10/12/new-obiee-benchmark-50000-users/</guid>
      <description>&lt;p&gt;A new OBIEE benchmark has been published by Oracle. It&amp;rsquo;s on the same hardware as August&amp;rsquo;s benchmark - Sun T5440s. Anyone would think that Oracle like Sun ;-)&lt;/p&gt;&#xA;&lt;p&gt;Details &lt;a href=&#34;http://blogs.sun.com/mandalika/entry/oracle_business_intelligence_10_1&#34;&gt;here&lt;/a&gt;&lt;/p&gt;&#xA;&lt;p&gt;Numbers added to my collation post &lt;a href=&#34;https://rmoff.net/2009/09/18/collated-obiee-benchmarks/&#34;&gt;here&lt;/a&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBIA 7.9.6.1 released</title>
      <link>https://rmoff.net/2009/10/12/obia-7-9-6-1-released/</link>
      <pubDate>Mon, 12 Oct 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/10/12/obia-7-9-6-1-released/</guid>
      <description>&lt;p&gt;The latest point release of Oracle Business Intelligence Applications, 7.9.6.1, is now available for download &lt;a href=&#34;http://www.oracle.com/technology/software/products/ias/htdocs/101320bi.html&#34;&gt;from here&lt;/a&gt; (&lt;a href=&#34;http://download.oracle.com/otn/nt/bi/biapps_windows_7961.zip&#34;&gt;direct link to download&lt;/a&gt;).&lt;/p&gt;&#xA;&lt;p&gt;The version.txt reports the version as:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;Build: 7.9.6.1.100609.2038 Release Version: Oracle Business Intelligence Applications 7.9.6.1 Package: 100609.2038&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;p&gt;No updated &lt;a href=&#34;http://www.oracle.com/technology/documentation/bi_apps.html&#34;&gt;documentation library&lt;/a&gt; yet though, so I can&amp;rsquo;t nose through the release notes. The docs that come with the download are labelled 7.9.6 and dated April 09 so don&amp;rsquo;t look like they&amp;rsquo;ve been updated either.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Usage Tracking - only half the story ...</title>
      <link>https://rmoff.net/2009/10/06/usage-tracking-only-half-the-story/</link>
      <pubDate>Tue, 06 Oct 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/10/06/usage-tracking-only-half-the-story/</guid>
      <description>&lt;p&gt;OBIEE comes with a very useful usage tracking feature. For information about it and how to set it up see these links:&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://obiee101.blogspot.com/2008/08/obiee-setting-up-usage-tracking.html&#34;&gt;http://obiee101.blogspot.com/2008/08/obiee-setting-up-usage-tracking.html&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://www.oracle.com/technology/obe/obe_bi/bi_ee_1013/usage_tracking/usage_tracking.htm&#34;&gt;http://www.oracle.com/technology/obe/obe_bi/bi_ee_1013/usage_tracking/usage_tracking.htm&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://108obiee.blogspot.com/2009/07/obiee-usage-tracking-setup-and-cloning.html&#34;&gt;http://108obiee.blogspot.com/2009/07/obiee-usage-tracking-setup-and-cloning.html&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;p&gt;Usage Tracking captures the logical SQL of queries in a column called QUERY_TEXT in the table S_NQ_ACCT. However, out of the box this column is defined as 1k (1024 bytes) long. In some situations this will limit its usefulness because the text will be truncated if necessary when it&amp;rsquo;s inserted.&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBIEE and HP Performance Center (a.k.a. LoadRunner) - Notes</title>
      <link>https://rmoff.net/2009/10/01/obiee-and-hp-performance-center-a-k-a-loadrunner-notes/</link>
      <pubDate>Thu, 01 Oct 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/10/01/obiee-and-hp-performance-center-a-k-a-loadrunner-notes/</guid>
      <description>&lt;p&gt;This is a supplemental post to &lt;a href=&#34;https://rmoff.net/2009/10/01/obiee-and-loadrunner-howto/&#34;&gt;this one describing how to set up a VUser in LoadRunner to test OBIEE&lt;/a&gt;. It&amp;rsquo;s various notes that I made during the development but which aren&amp;rsquo;t directly part of the step-by-step tutorial. They&amp;rsquo;re not necessarily vital for recording scripts, but observations and explanations that should be helpful when working with LoadRunner and OBIEE.&lt;/p&gt;&#xA;&lt;h2 id=&#34;validation-using-sawserver-logs&#34;&gt;Validation using sawserver logs&lt;/h2&gt;&#xA;&lt;p&gt;It&amp;rsquo;s no use running a load test if the load you think you&amp;rsquo;re applying isn&amp;rsquo;t actually being applied. To validate the test you compare what happens on the server when the scenario is performed manually with what happens when it&amp;rsquo;s run from a VUser; hopefully the same behaviour is observed.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Performance testing OBIEE using HP Performance Center (a.k.a. LoadRunner)</title>
      <link>https://rmoff.net/2009/10/01/obiee-and-loadrunner-howto/</link>
      <pubDate>Thu, 01 Oct 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/10/01/obiee-and-loadrunner-howto/</guid>
      <description>&lt;p&gt;My two earlier posts (&lt;a href=&#34;https://rmoff.net/2009/08/19/obiee-and-load-runner-part-1/&#34;&gt;here&lt;/a&gt; and &lt;a href=&#34;https://rmoff.net/2009/08/21/obiee-and-load-runner-part-2/&#34;&gt;here&lt;/a&gt;) detail the difficulties I had with LoadRunner (now called HP Performance Center). After a bit of a break along with encouragement from knowing that it must be possible because it&amp;rsquo;s how Oracle generates their &lt;a href=&#34;https://rmoff.net/2009/09/18/collated-obiee-benchmarks/&#34;&gt;OBIEE benchmarks&lt;/a&gt; I&amp;rsquo;ve now got something working. I also got a useful doc from Oracle support which outlines pretty much what I&amp;rsquo;ve done here too.&lt;/p&gt;&#xA;&lt;p&gt;In essence what you do - and what the &lt;a href=&#34;https://support.oracle.com/CSP/ui/flash.html#tab=KBHome(page=KBHome&amp;amp;id=()),(page=KBNavigator&amp;amp;id=(bmDocID=496417.1&amp;amp;from=BOOKMARK&amp;amp;bmDocDsrc=KB&amp;amp;viewingMode=1143))&#34;&gt;Metalink document 496417.1&lt;/a&gt; states - is you use the Web (HTTP/HTML) protocol with URL-mode.&lt;/p&gt;</description>
    </item>
    <item>
      <title>James Morle : Spotting the Red Flags (Part 1 of n)</title>
      <link>https://rmoff.net/2009/09/25/james-morle-spotting-the-red-flags-part-1-of-n/</link>
      <pubDate>Fri, 25 Sep 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/09/25/james-morle-spotting-the-red-flags-part-1-of-n/</guid>
      <description>&lt;p&gt;A &lt;a href=&#34;http://jamesmorle.wordpress.com/&#34;&gt;new blog from James Morle&lt;/a&gt;, who I don&amp;rsquo;t know but from other bloggers sounds well respected, and describes himself thus:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;&lt;em&gt;Since it’s been nearly ten years since I wrote my book,&lt;/em&gt; &lt;a href=&#34;http://cseng.aw.com/catalog/academic/product/0,1144,0201325748-TOC,00.html&#34;&gt;&lt;em&gt;Scaling Oracle8i&lt;/em&gt;&lt;/a&gt;&lt;em&gt;, I thought it was about time that I started writing again. I thought I would start with the new-fangled blogging thing, and see where it takes me. Here goes.&lt;/em&gt;&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;p&gt;He&amp;rsquo;s got a really interesting post on &amp;ldquo;red flags&amp;rdquo; to look for in diagnosing performance problems in Oracle: &lt;a href=&#34;http://jamesmorle.wordpress.com/2009/09/24/spotting-the-red-flags-part-1-of-n/&#34;&gt;Spotting the Red Flags (Part 1 of n)&lt;/a&gt;.&lt;/p&gt;</description>
    </item>
    <item>
      <title>SQL Developer v2.1 Early Adopter released</title>
      <link>https://rmoff.net/2009/09/25/sql-developer-v2-1-early-adopter-released/</link>
      <pubDate>Fri, 25 Sep 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/09/25/sql-developer-v2-1-early-adopter-released/</guid>
      <description>&lt;p&gt;SQL Developer v2.1 Early Adopter was released yesterday.&lt;/p&gt;&#xA;&lt;p&gt;&lt;a href=&#34;http://www.oracle.com/technology/software/products/sql/index21_EA1.html&#34;&gt;Download it here&lt;/a&gt;&lt;/p&gt;&#xA;&lt;p&gt;&lt;a href=&#34;http://www.oracle.com/technology/products/database/sql_developer/files/NewFeatureList21.htm&#34;&gt;New features list&lt;/a&gt;&lt;/p&gt;&#xA;&lt;p&gt;&lt;a href=&#34;http://www.oraclenerd.com/2009/09/sql-developer-21-early-adopter-1.html&#34;&gt;Hat-tip&lt;/a&gt; and &lt;a href=&#34;http://jhdba.wordpress.com/&#34;&gt;hat-tip&lt;/a&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>Changing password on Oracle 11g from 10g clients (ORA-28001 -&gt; ORA-01017)</title>
      <link>https://rmoff.net/2009/09/23/changing-password-on-oracle-11g-from-10g-clients-ora-28001-ora-01017/</link>
      <pubDate>Wed, 23 Sep 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/09/23/changing-password-on-oracle-11g-from-10g-clients-ora-28001-ora-01017/</guid>
      <description>&lt;p&gt;Bit of an odd one this. Oracle 11g database, a user&amp;rsquo;s password has expired. But when I try to change it, I can&amp;rsquo;t:&lt;/p&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#19177c&#34;&gt;$sqlplus&lt;/span&gt; MYUSER/oldPW@oraDBServer&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;SQL*Plus: Release 10.2.0.1.0 - Production on Wed Sep &lt;span style=&#34;color:#666&#34;&gt;23&lt;/span&gt; 07:57:41 &lt;span style=&#34;color:#666&#34;&gt;2009&lt;/span&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;Copyright &lt;span style=&#34;color:#666&#34;&gt;(&lt;/span&gt;c&lt;span style=&#34;color:#666&#34;&gt;)&lt;/span&gt; 1982, 2005, Oracle.  
All rights reserved.&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;ERROR:&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;ORA-28001: the password has expired&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;Changing password &lt;span style=&#34;color:#008000;font-weight:bold&#34;&gt;for&lt;/span&gt; MYUSER&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;New password:&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;Retype new password:&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;ERROR:&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;ORA-01017: invalid username/password; logon denied&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;Password unchanged&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;After a bit of digging around I found &lt;a href=&#34;http://www.experts-exchange.com/Database/Oracle/Q_24264349.html&#34;&gt;a post&lt;/a&gt; (&lt;a href=&#34;http://209.85.229.132/search?q=cache:hqJIemZFzTgJ:www.experts-exchange.com/Database/Oracle/Q_24264349.html+ora+expired+sqlplus+28001+01017&amp;amp;cd=1&amp;amp;hl=en&amp;amp;ct=clnk&#34;&gt;cached&lt;/a&gt;) which says that this is a problem when you use 10g clients with 11g database. And sure enough:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Collated OBIEE benchmarks</title>
      <link>https://rmoff.net/2009/09/18/collated-obiee-benchmarks/</link>
      <pubDate>Fri, 18 Sep 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/09/18/collated-obiee-benchmarks/</guid>
      <description>&lt;p&gt;(Updated 12th Oct 09)&lt;/p&gt;&#xA;&lt;p&gt;Here&amp;rsquo;s a list of the OBIEE benchmark documents published by Oracle:&lt;/p&gt;&#xA;&lt;table class=&#34;inline&#34; border=&#34;0&#34;&gt;&lt;tbody&gt;&lt;tr class=&#34;row0&#34;&gt;&lt;td class=&#34;col0&#34;&gt;&lt;strong&gt;Benchmark&lt;/strong&gt;&lt;/td&gt;&lt;td class=&#34;col1&#34;&gt;&lt;strong&gt;Date&lt;/strong&gt;&lt;/td&gt;&lt;td class=&#34;col2&#34;&gt;&lt;strong&gt;Source document&lt;/strong&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr class=&#34;row0&#34;&gt;&lt;td class=&#34;col0&#34;&gt;1 - IBM System x3755&lt;/td&gt;&lt;td class=&#34;col1&#34;&gt;Sep-07&lt;/td&gt;&lt;td class=&#34;col2&#34;&gt;&lt;a class=&#34;urlextern&#34; title=&#34;http://www.oracle.com/appserver/business-intelligence/docs/bi-suite-ee-4000-benchmark-x3755.pdf&#34; rel=&#34;nofollow&#34; href=&#34;http://www.oracle.com/appserver/business-intelligence/docs/bi-suite-ee-4000-benchmark-x3755.pdf&#34; target=&#34;_ext&#34;&gt;PDF&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr class=&#34;row1&#34;&gt;&lt;td class=&#34;col0&#34;&gt;2 - HP DL380 G4&lt;/td&gt;&lt;td class=&#34;col1&#34;&gt;Sep-07&lt;/td&gt;&lt;td class=&#34;col2&#34;&gt;&lt;a class=&#34;urlextern&#34; title=&#34;http://www.oracle.com/appserver/business-intelligence/docs/oracle-biee-5k-user-benchmark-hpdl3802.pdf&#34; rel=&#34;nofollow&#34; href=&#34;http://www.oracle.com/appserver/business-intelligence/docs/oracle-biee-5k-user-benchmark-hpdl3802.pdf&#34; target=&#34;_ext&#34;&gt;PDF&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr class=&#34;row2&#34;&gt;&lt;td class=&#34;col0&#34;&gt;3 - Sun T2000&lt;/td&gt;&lt;td class=&#34;col1&#34;&gt;Sep-07&lt;/td&gt;&lt;td class=&#34;col2&#34;&gt;&lt;a class=&#34;urlextern&#34; title=&#34;http://www.oracle.com/appserver/business-intelligence/docs/oracle-bi-ee-10k-benchmark-sunt2000.pdf&#34; rel=&#34;nofollow&#34; href=&#34;http://www.oracle.com/appserver/business-intelligence/docs/oracle-bi-ee-10k-benchmark-sunt2000.pdf&#34; 
target=&#34;_ext&#34;&gt;PDF&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr class=&#34;row3&#34;&gt;&lt;td class=&#34;col0&#34;&gt;4 - Sun SPARC Enterprise T5440&lt;/td&gt;&lt;td class=&#34;col1&#34;&gt;Aug-09&lt;/td&gt;&lt;td class=&#34;col2&#34;&gt;&lt;a class=&#34;urlextern&#34; title=&#34;http://www.oracle.com/appserver/business-intelligence/docs/oraclebiee_28000user_benchmark_on_solaris_t5440.pdf&#34; rel=&#34;nofollow&#34; href=&#34;http://www.oracle.com/appserver/business-intelligence/docs/oraclebiee_28000user_benchmark_on_solaris_t5440.pdf&#34; target=&#34;_ext&#34;&gt;PDF&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr class=&#34;row3&#34;&gt;&lt;td class=&#34;col0&#34;&gt;5 - Sun SPARC Enterprise T5440&lt;/td&gt;&lt;td class=&#34;col1&#34;&gt;Oct-09&lt;/td&gt;&lt;td class=&#34;col2&#34;&gt;&lt;a class=&#34;urlextern&#34; title=&#34;http://www.oracle.com/appserver/business-intelligence/docs/oraclebiee_50000user_benchmark_on_solaris_t5440.pdf&#34; rel=&#34;nofollow&#34; href=&#34;http://www.oracle.com/appserver/business-intelligence/docs/oraclebiee_50000user_benchmark_on_solaris_t5440.pdf&#34; target=&#34;_ext&#34;&gt;PDF&lt;/a&gt;&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;&#xA;&lt;p&gt;Collecting the numbers into one table gives this: &lt;a href=&#34;https://rmoff.net/images/2009/09/benchmarks2.webp&#34;&gt;&lt;img src=&#34;https://rmoff.net/images/rnm1978/benchmarks2.png&#34; alt=&#34;benchmarks&#34; title=&#34;benchmarks&#34;&gt;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;p&gt;Based on the details in the documents I think these were all against OBIA&amp;rsquo;s Service Analytics schema &amp;amp; dashboards/reports.&lt;/p&gt;&#xA;&lt;p&gt;Interesting to note the side-by-side comparison in benchmark 3 (Sun T2000) of two servers, in one case both running BI and Presentation Services and in the other having the two components separate. It appears to highlight the benefit that clustering provides in making the best use of resources.&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBIEE benchmarks</title>
      <link>https://rmoff.net/2009/09/17/obiee-benchmarks/</link>
      <pubDate>Thu, 17 Sep 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/09/17/obiee-benchmarks/</guid>
      <description>&lt;p&gt;Here&amp;rsquo;s a list of the OBIEE benchmark documents published by Oracle:&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;&lt;a href=&#34;http://www.oracle.com/appserver/business-intelligence/docs/oraclebiee_28000user_benchmark_on_solaris_t5440.pdf&#34;&gt;28,000 User Benchmark on Sun SPARC Enterprise T5440 Server running Solaris 10 [Aug 09]&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;&lt;a href=&#34;http://www.oracle.com/appserver/business-intelligence/docs/oracle-bi-ee-10k-benchmark-sunt2000.pdf&#34;&gt;10,000 User Benchmark on Sun T2000 [Sept 07]&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;&lt;a href=&#34;http://www.oracle.com/appserver/business-intelligence/docs/oracle-biee-5k-user-benchmark-hpdl3802.pdf&#34;&gt;5,800 User Benchmark on HP DL380 G4 [Sept 07]&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;&lt;a href=&#34;http://www.oracle.com/appserver/business-intelligence/docs/bi-suite-ee-4000-benchmark-x3755.pdf&#34;&gt;4,000 User Benchmark on an IBM System x3755 Server running Red Hat Enterprise Linux&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/li&gt;&#xA;&lt;/ul&gt;</description>
    </item>
    <item>
      <title>OBIEE cluster controller failover in action</title>
      <link>https://rmoff.net/2009/09/15/obiee-cluster-controller-failover-in-action/</link>
      <pubDate>Tue, 15 Sep 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/09/15/obiee-cluster-controller-failover-in-action/</guid>
      <description>&lt;p&gt;Production cluster is 2x BI Server and 2x Presentation Services, with a BIG-IP F5 load balancer on the front.&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/rnm1978/1pub1.png&#34; alt=&#34;1pub&#34; title=&#34;1pub&#34;&gt;&lt;/p&gt;&#xA;&lt;h2 id=&#34;symptoms&#34;&gt;Symptoms&lt;/h2&gt;&#xA;&lt;p&gt;Users started reporting slow login times to BI. Our monitoring tool (Openview) reported that &amp;ldquo;BIServer01 may be down. Failed to contact it using ping.&amp;rdquo; BIServer01 could not be reached by ping or ssh from the Windows network.&lt;/p&gt;&#xA;&lt;h2 id=&#34;diagnostics&#34;&gt;Diagnostics&lt;/h2&gt;&#xA;&lt;p&gt;nqsserver and nqsclustercontroller on BIServer01 were logging these repeated errors:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;[nQSError: 12002] Socket communication error at call=send: (Number=9) Bad file number&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBIEE on Opera 10 / IE 7 / FF 3.5 / Chrome 4</title>
      <link>https://rmoff.net/2009/09/10/obiee-on-opera-10-ie-7-ff-3-5/</link>
      <pubDate>Thu, 10 Sep 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/09/10/obiee-on-opera-10-ie-7-ff-3-5/</guid>
      <description>&lt;p&gt;A new version of the web browser &lt;a href=&#34;http://www.opera.com/&#34;&gt;Opera&lt;/a&gt; was released recently. Several years ago I used Opera and may even have paid for it IIRC. Then Firefox came along, and the &amp;ldquo;it&amp;rsquo;s not IE&amp;rdquo; excuse was lost for Opera. Not that I mind IE too much nowadays but at the time it was atrocious. Since then I&amp;rsquo;ve revisited Opera each time a new release has come out, but nothing has impressed me enough to ditch Firefox (and nowadays Google Chrome).&lt;/p&gt;</description>
    </item>
    <item>
      <title>Syntax for AdminTool.exe command line script</title>
      <link>https://rmoff.net/2009/09/09/syntax-for-admintool-exe-command-line-script/</link>
      <pubDate>Wed, 09 Sep 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/09/09/syntax-for-admintool-exe-command-line-script/</guid>
      <description>&lt;p&gt;Bringing together in one place all of the script syntax that I&amp;rsquo;ve found so far for using with OBIEE&amp;rsquo;s &lt;strong&gt;AdminTool.exe /command&lt;/strong&gt;&lt;/p&gt;&#xA;&lt;p&gt;Details and examples on usage in the following blogs (where I compiled the commands from):&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://oraclebizint.wordpress.com/2008/05/02/oracle-bi-ee-101332-automating-password-updates-of-connection-pools-and-users-command-line-options/&#34;&gt;Venkat&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://translate.google.co.uk/translate?hl=en&amp;amp;sl=cs&amp;amp;u=http://bidwcz.blogspot.com/2008_05_01_archive.html&amp;amp;ei=KkqmSsbuNcfajQen74W6Dg&amp;amp;sa=X&amp;amp;oi=translate&amp;amp;resnum=6&amp;amp;ct=result&amp;amp;prev=/search%3Fq%3D%2522admintool.exe%2B/command%2522%26hl%3Den%26client%3Dfirefox-a%26rls%3Dorg.mozilla:en-GB:official%26hs%3DEKf%26num%3D30&#34;&gt;Erik Eckhardt&lt;/a&gt; (translated from Czech, &lt;a href=&#34;http://bidwcz.blogspot.com/2008/05/bi-administration-tool-v-pkazovm-mdu.html&#34;&gt;original here&lt;/a&gt;)&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://siebel-essentials.blogspot.com/2008/11/automating-rpd-metadata-export-with.html&#34;&gt;@lex&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://obieeblog.wordpress.com/2009/08/04/simplifying-migration-process-%E2%80%93-changing-environment-specific-variables-in-rpd/&#34;&gt;Kumar Kambam&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;hr&gt;&#xA;&lt;p&gt;&lt;strong&gt;DON&amp;rsquo;T TRY THIS AT HOME!&lt;/strong&gt;&lt;/p&gt;&#xA;&lt;p&gt;&lt;strong&gt;I would only recommend this for read-only purposes such as generating the metadata dictionary or consistency check.&lt;/strong&gt;&lt;/p&gt;&#xA;&lt;hr&gt;&#xA;&lt;p&gt;* OpenOnline DSN [user [password]] - Opens the online repository. 
NB can&amp;rsquo;t edit properties without checking out objects first, and no way to do that from script.&lt;/p&gt;</description>
    </item>
    <item>
      <title>AdminTool.exe /command</title>
      <link>https://rmoff.net/2009/09/08/admintool-exe-command/</link>
      <pubDate>Tue, 08 Sep 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/09/08/admintool-exe-command/</guid>
      <description>&lt;p&gt;There&amp;rsquo;s an undocumented feature in AdminTool.exe that you can use the /command switch with a text file containing scripted commands to make changes to an RPD file (or create a new one).&lt;/p&gt;&#xA;&lt;p&gt;It&amp;rsquo;s undocumented and &lt;strong&gt;UNSUPPORTED&lt;/strong&gt; so be careful using it.&lt;/p&gt;&#xA;&lt;p&gt;Some good details in these blog posts, especially Erik&amp;rsquo;s which has a good list of syntax.&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://oraclebizint.wordpress.com/2008/05/02/oracle-bi-ee-101332-automating-password-updates-of-connection-pools-and-users-command-line-options/&#34;&gt;Venkat&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://translate.google.co.uk/translate?hl=en&amp;amp;sl=cs&amp;amp;u=http://bidwcz.blogspot.com/2008_05_01_archive.html&amp;amp;ei=KkqmSsbuNcfajQen74W6Dg&amp;amp;sa=X&amp;amp;oi=translate&amp;amp;resnum=6&amp;amp;ct=result&amp;amp;prev=/search%3Fq%3D%2522admintool.exe%2B/command%2522%26hl%3Den%26client%3Dfirefox-a%26rls%3Dorg.mozilla:en-GB:official%26hs%3DEKf%26num%3D30&#34;&gt;Erik Eckhardt&lt;/a&gt; (translated from Czech, &lt;a href=&#34;http://bidwcz.blogspot.com/2008/05/bi-administration-tool-v-pkazovm-mdu.html&#34;&gt;original here&lt;/a&gt;)&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://siebel-essentials.blogspot.com/2008/11/automating-rpd-metadata-export-with.html&#34;&gt;@lex&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://obieeblog.wordpress.com/2009/08/04/simplifying-migration-process-%E2%80%93-changing-environment-specific-variables-in-rpd/&#34;&gt;Kumar Kambam&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;p&gt;I&amp;rsquo;m intrigued to know how the original posters figured out the commands available, if it&amp;rsquo;s undocumented&amp;hellip; :)&lt;/p&gt;</description>
    </item>
    <item>
      <title>Metalink 3 followup</title>
      <link>https://rmoff.net/2009/09/04/metalink-3-followup/</link>
      <pubDate>Fri, 04 Sep 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/09/04/metalink-3-followup/</guid>
      <description>&lt;p&gt;Kudos to the &lt;a href=&#34;http://blogs.oracle.com/supportportal&#34;&gt;My Oracle Support blog&lt;/a&gt; for taking the time to respond to my comment about searching for Metalink 3 SRs throwing an error.&lt;/p&gt;&#xA;&lt;p&gt;In essence, if you previously used Metalink 3 you must use &lt;a href=&#34;https://support.oracle.com&#34;&gt;https://support.oracle.com&lt;/a&gt;. If you use &lt;a href=&#34;https://metalink.oracle.com/&#34;&gt;https://metalink.oracle.com/&lt;/a&gt; then you&amp;rsquo;ll hit the problems I did.&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;ML3 SR&amp;rsquo;s are not supported on &lt;a href=&#34;https://metalink.oracle.com/CSP/ui/index.html&#34;&gt;https://metalink.oracle.com/CSP/ui/index.html&lt;/a&gt;. This front end is used for support on legacy server technology, middleware including BEA, and EBusinessSuite&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;p&gt;&lt;a href=&#34;http://blogs.oracle.com/supportportal/2009/09/welcome_to_my_oracle_support_-.html#comment-153658&#34;&gt;Full details here&lt;/a&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>RSS feeds for OBIEE, including del.icio.us obiee tags</title>
      <link>https://rmoff.net/2009/09/04/rss-feeds-for-obiee-including-del-icio-us-obiee-tags/</link>
      <pubDate>Fri, 04 Sep 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/09/04/rss-feeds-for-obiee-including-del-icio-us-obiee-tags/</guid>
      <description>&lt;p&gt;Another way of keeping up with what&amp;rsquo;s going on in the obiee world, add &lt;a href=&#34;http://feeds.delicious.com/v2/rss/tag/obiee?count=15&#34;&gt;this RSS feed&lt;/a&gt; of del.icio.us obiee tags to your reader. It may be less &amp;ldquo;current&amp;rdquo; (because people might discover and bookmark &amp;lsquo;old&amp;rsquo; pages), but it&amp;rsquo;s another tool in the armoury :)&lt;/p&gt;&#xA;&lt;p&gt;If you want an aggregated RSS feed of OBIEE / Oracle related blog postings &lt;a href=&#34;http://www.google.com/reader/public/atom/user%2F13728252801323748997%2Flabel%2FOracle&#34;&gt;try this one&lt;/a&gt;.&lt;/p&gt;</description>
    </item>
    <item>
      <title>CAF troubles</title>
      <link>https://rmoff.net/2009/09/03/caf-troubles/</link>
      <pubDate>Thu, 03 Sep 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/09/03/caf-troubles/</guid>
      <description>&lt;p&gt;Following the &lt;a href=&#34;http://www.oracle.com/technology/obe/obe_bi/bi_ee_1013/caf/caf.html&#34;&gt;Oracle CAF tutorial&lt;/a&gt; here, I got to the Cloning Answers Requests section and then got stuck. I&amp;rsquo;d set up my environment exactly the same as in the tutorial, down to the same paths etc. After firing up the CAF to clone requests from the SampleSales catalog: &lt;img src=&#34;https://rmoff.net/images/rnm1978/caf1.png&#34; alt=&#34;caf1&#34; title=&#34;caf1&#34;&gt; I clicked on Next and got the error &lt;strong&gt;&amp;ldquo;Exception occurred when while initializing repository!!!&amp;rdquo;&lt;/strong&gt;&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/rnm1978/error.png&#34; alt=&#34;error&#34; title=&#34;error&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;By playing around with the passwords and path names I determined that both RPD files existed and that CAF could load them enough to validate the passwords. If the password is incorrect you get the error &amp;ldquo;The repository C:\CAF_Training\Source\samplesales.rpd cannot be opened&amp;rdquo;.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Metalink 3 RIP</title>
      <link>https://rmoff.net/2009/09/02/metalink-3-rip/</link>
      <pubDate>Wed, 02 Sep 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/09/02/metalink-3-rip/</guid>
      <description>&lt;p&gt;This weekend just gone Metalink3 went to the digital dustbin. In principle this is a Good Thing, as multiple support websites for a single company is confusing and frustrating.&lt;/p&gt;&#xA;&lt;p&gt;Metalink is now &amp;ldquo;My Oracle Support&amp;rdquo; and is a flash-based whizz-bang affair. Everyone has different tastes, but there&amp;rsquo;s a lot to be said for plain HTML for ease and speed of access. But then people probably grumbled to the Wright Brothers that there was nothing wrong with land-transport at the time&amp;hellip;&lt;/p&gt;</description>
    </item>
    <item>
      <title>Multiple RPDs on one server - Part 1 - the BI Server</title>
      <link>https://rmoff.net/2009/08/25/multiple-rpds-on-one-server-part-1-the-bi-server/</link>
      <pubDate>Tue, 25 Aug 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/08/25/multiple-rpds-on-one-server-part-1-the-bi-server/</guid>
      <description>&lt;h2 id=&#34;introduction&#34;&gt;Introduction&lt;/h2&gt;&#xA;&lt;p&gt;In this article I plan to get samplesales and paint repositories hosted on a single server, using one BI Server instance and two Presentation Services instances. This is on both Unix (OEL 4) and Windows, and both OC4J (OBIEE&amp;rsquo;s &amp;ldquo;basic installation&amp;rdquo; option) and OAS (&amp;ldquo;Advanced Installation&amp;rdquo;).&lt;/p&gt;&#xA;&lt;p&gt;Both samplesales and paint are shipped with 10.1.3.4 of OBIEE, you&amp;rsquo;ll find them in $OracleBI/OracleBI/server/Sample. This article assumes you&amp;rsquo;ve got the RPD of each into $OracleBI/OracleBI/server/Repository and unpacked the web cats for each into $OracleBIdata/web/catalog. It also assumes that you know your way around the architecture of BI and are familiar with NQSConfig.ini and instanceconfig.xml - if neither of those files mean anything to you then you will find some &lt;a href=&#34;http://obiee101.blogspot.com/2009/07/obiee-how-to-get-started.html&#34;&gt;background reading&lt;/a&gt; useful.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Multiple RPDs on one server - Part 2 - Presentation Services</title>
      <link>https://rmoff.net/2009/08/25/multiple-rpds-on-one-server-part-2-presentation-services/</link>
      <pubDate>Tue, 25 Aug 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/08/25/multiple-rpds-on-one-server-part-2-presentation-services/</guid>
      <description>&lt;h2 id=&#34;introduction&#34;&gt;Introduction&lt;/h2&gt;&#xA;&lt;p&gt;In this article I plan to get sample and paint repositories hosted on a single server, using one BI Server instance and two Presentation Services instances. This is on both Unix (OEL 4) and Windows, and both OC4J (OBIEE&amp;rsquo;s &amp;ldquo;basic installation&amp;rdquo; option) and OAS (&amp;ldquo;Advanced Installation&amp;rdquo;).&lt;/p&gt;&#xA;&lt;p&gt;Make sure you&amp;rsquo;ve read and followed &lt;a href=&#34;https://rmoff.net/2009/08/25/multiple-rpds-on-one-server-part-1-the-bi-server/&#34;&gt;part 1 - BI Server&lt;/a&gt; first.&lt;/p&gt;&#xA;&lt;p&gt;Remember that multiple Presentation Services instances on a machine is &lt;strong&gt;UNSUPPORTED BY ORACLE!&lt;/strong&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBIEE error/message code reference</title>
      <link>https://rmoff.net/2009/08/24/obiee-errormessage-code-reference/</link>
      <pubDate>Mon, 24 Aug 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/08/24/obiee-errormessage-code-reference/</guid>
      <description>&lt;p&gt;For some reason Oracle haven&amp;rsquo;t put out a 10.x version of the error &amp;amp; message codes reference guide for OBIEE, but the previous version for Siebel Analytics is still useful:&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;PDF version: &lt;a href=&#34;http://download.oracle.com/otndocs/products/bi/bi-ee/docs/784/AnyMsg.pdf&#34;&gt;http://download.oracle.com/otndocs/products/bi/bi-ee/docs/784/AnyMsg.pdf&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;HTML version: &lt;a href=&#34;http://download.oracle.com/docs/cd/E12103_01/books/AnyMsg/AnyMsg_Messages.html#wp1007961&#34;&gt;http://download.oracle.com/docs/cd/E12103_01/books/AnyMsg/AnyMsg_Messages.html#wp1007961&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;</description>
    </item>
    <item>
      <title>Tech Support Cheat Sheet [xkcd.com]</title>
      <link>https://rmoff.net/2009/08/24/tech-support-cheat-sheet-xkcd-com/</link>
      <pubDate>Mon, 24 Aug 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/08/24/tech-support-cheat-sheet-xkcd-com/</guid>
      <description>&lt;p&gt;&lt;a href=&#34;http://www.xkcd.com&#34;&gt;xkcd.com&lt;/a&gt; is one of my favourite comics on the web. It strikes just the right balance of geeky witty humour without being too smart-ass.&lt;/p&gt;&#xA;&lt;p&gt;I liked this recent one a lot :-)&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/rnm1978/tech_support_cheat_sheet.png&#34; alt=&#34;&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;A couple of other favourites:&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://xkcd.com/149/&#34;&gt;http://xkcd.com/149/&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://xkcd.com/327/&#34;&gt;http://xkcd.com/327/&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://xkcd.com/123/&#34;&gt;http://xkcd.com/123/&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;</description>
    </item>
    <item>
      <title>OBIEE and Load Runner - part 2</title>
      <link>https://rmoff.net/2009/08/21/obiee-and-load-runner-part-2/</link>
      <pubDate>Fri, 21 Aug 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/08/21/obiee-and-load-runner-part-2/</guid>
      <description>&lt;p&gt;&lt;strong&gt;UPDATED: See a HOWTO for OBIEE and LoadRunner here: &lt;a href=&#34;https://rmoff.net/2009/10/01/obiee-and-loadrunner-howto/&#34;&gt;/2009/10/01/obiee-and-loadrunner-howto/&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;&#xA;&lt;hr&gt;&#xA;&lt;p&gt;This is following on from &lt;a href=&#34;https://rmoff.net/2009/08/19/obiee-and-load-runner-part-1/&#34;&gt;my first post about OBIEE and LoadRunner&lt;/a&gt;, in which I failed dismally to get a simple session replaying.&lt;/p&gt;&#xA;&lt;p&gt;In a nutshell where I&amp;rsquo;d got to was using the &amp;ldquo;Web (Click and Script)&amp;rdquo; function which worked fine for logging in but when running a report resulted in an error on the rendered page. Digging around showed the error was from the javascript of the OBIEE front end.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Querying SQL Server from OBIEE running on Unix</title>
      <link>https://rmoff.net/2009/08/21/querying-sql-server-from-obiee-running-on-unix/</link>
      <pubDate>Fri, 21 Aug 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/08/21/querying-sql-server-from-obiee-running-on-unix/</guid>
      <description>&lt;p&gt;A question that pops up on the &lt;a href=&#34;http://forums.oracle.com/forums/adfAuthentication?success_url=https://rmoff.net/forum.jspa?forumID=378&#34;&gt;OBIEE OTN forum&lt;/a&gt; quite often is how to use non-Oracle databases like MS SQL Server when the OBIEE server is running on a non-Windows OS such as Linux.&lt;/p&gt;&#xA;&lt;p&gt;The answer in a nutshell is that since version 10.1.3.3.1 OBIEE has been bundled with ODBC drivers for unix/linux from a company called DataDirect. See the &lt;a href=&#34;http://download.oracle.com/docs/cd/E10415_01/doc/bi.1013/e10416/general_101331.htm#BABDHJAG&#34;&gt;release notes here&lt;/a&gt; for more information and installation instructions (as well as a list of supported databases).&lt;/p&gt;</description>
    </item>
    <item>
      <title>Do you mean (pt II)</title>
      <link>https://rmoff.net/2009/08/20/do-you-mean-pt-ii/</link>
      <pubDate>Thu, 20 Aug 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/08/20/do-you-mean-pt-ii/</guid>
      <description>&lt;p&gt;A follow up to &lt;a href=&#34;https://rmoff.net/2009/07/24/metalink-3-do-you-mean/&#34;&gt;my previous post about&lt;/a&gt; Metalink&amp;rsquo;s &amp;ldquo;Do you mean&amp;rdquo; feature, this one made me laugh: &lt;img src=&#34;https://rmoff.net/images/rnm1978/didyoumean.png&#34; alt=&#34;didyoumean&#34; title=&#34;didyoumean&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;I shall miss this kind of thing when Metalink3 merges into My Oracle Support&amp;hellip;.&lt;/p&gt;&#xA;&lt;p&gt;Meep meep!&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/rnm1978/road-runner-1.jpg&#34; alt=&#34;road-runner-1&#34; title=&#34;road-runner-1&#34;&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>Logging specific types of sawserver activity</title>
      <link>https://rmoff.net/2009/08/20/logging-specific-types-of-sawserver-activity/</link>
      <pubDate>Thu, 20 Aug 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/08/20/logging-specific-types-of-sawserver-activity/</guid>
      <description>&lt;p&gt;As well as tinkering with the sawserver (Presentation Services) &lt;a href=&#34;https://rmoff.net/2009/07/23/sawserver-logging-configuration-logconfig-xml/&#34;&gt;logging level&lt;/a&gt; and &lt;a href=&#34;https://rmoff.net/2009/08/19/sawserver-log-short-format/&#34;&gt;format&lt;/a&gt;, we can specify which bits of the log we&amp;rsquo;re interested in. This is useful for two reasons:&lt;/p&gt;&#xA;&lt;ol&gt;&#xA;&lt;li&gt;We can enable detailed logging for a specific area, without the performance impact that detailed logging throughout would cause&lt;/li&gt;&#xA;&lt;li&gt;By only logging in detail the area of interest we can more easily read the log output and not have to wade through pages of irrelevant information&lt;/li&gt;&#xA;&lt;/ol&gt;&#xA;&lt;p&gt;Chapter 9 (“Using the Oracle BI Presentation Services Logging Facility”) of the &lt;a href=&#34;http://download.oracle.com/docs/cd/E10415_01/doc/bi.1013/b31766.pdf&#34;&gt;Presentation Services Administration Guide&lt;/a&gt; details the log configuration.&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBIEE and Load Runner - part 1</title>
      <link>https://rmoff.net/2009/08/19/obiee-and-load-runner-part-1/</link>
      <pubDate>Wed, 19 Aug 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/08/19/obiee-and-load-runner-part-1/</guid>
      <description>&lt;p&gt;&lt;strong&gt;UPDATED: See a HOWTO for OBIEE and LoadRunner &lt;a href=&#34;https://rmoff.net/2009/10/01/obiee-and-loadrunner-howto/&#34;&gt;here&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;&#xA;&lt;hr&gt;&#xA;&lt;h2 id=&#34;introduction&#34;&gt;Introduction&lt;/h2&gt;&#xA;&lt;p&gt;LoadRunner is a tool from HP (bought from Mercury) that can be used to simulate user activity. It supports a whole host of protocols but for OBIEE I&amp;rsquo;m obviously using the Web one.&lt;/p&gt;&#xA;&lt;p&gt;There are two flavours, &amp;ldquo;Web (Click and Script)&amp;rdquo; and &amp;ldquo;Web (HTTP/HTML)&amp;rdquo;. The latter simply shoves HTTP requests at the server, whereas &amp;ldquo;Click and Script&amp;rdquo; simulates mouse and keyboard entry and thus is more appropriate for this user-based application. [edit]&lt;em&gt;I&amp;rsquo;m not sure if this is actually the case&lt;/em&gt;[/edit]&lt;/p&gt;</description>
    </item>
    <item>
      <title>sawserver log - short format</title>
      <link>https://rmoff.net/2009/08/19/sawserver-log-short-format/</link>
      <pubDate>Wed, 19 Aug 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/08/19/sawserver-log-short-format/</guid>
      <description>&lt;p&gt;&lt;a href=&#34;https://rmoff.net/2009/07/23/sawserver-logging-configuration-logconfig-xml/&#34;&gt;I posted a while ago&lt;/a&gt; about the sawserver (Presentation Services) log configuration file. Today I&amp;rsquo;m doing some work digging around why sawserver&amp;rsquo;s throwing an error and so have increased the log detail. This parameter is really helpful to use:&lt;/p&gt;&#xA;&lt;p&gt;&lt;strong&gt;fmtName=&amp;ldquo;short&amp;rdquo;&lt;/strong&gt;&lt;/p&gt;&#xA;&lt;p&gt;Consider these two screenshots: the first is with the default log format and shows about six entries; the second is the short log format and shows about ten times as much data.&lt;/p&gt;</description>
    </item>
    <item>
      <title>OTN forums - different URL to get prompted to login less often</title>
      <link>https://rmoff.net/2009/08/18/otn-forums-different-url-to-get-prompted-to-login-less-often/</link>
      <pubDate>Tue, 18 Aug 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/08/18/otn-forums-different-url-to-get-prompted-to-login-less-often/</guid>
      <description>&lt;p&gt;I have a couple of OTN forums bookmarked, and found that generally every few hours I get signed out and end up viewing them as Guest. I then have to click on sign in, enter password, etc.&lt;/p&gt;&#xA;&lt;p&gt;(Signing out this frequently is ridiculous, IMHO).&lt;/p&gt;&#xA;&lt;p&gt;I found that instead of using the direct URL of a forum, eg: &lt;strong&gt;&lt;a href=&#34;http://forums.oracle.com/forums/forum.jspa?forumID=378&#34;&gt;http://forums.oracle.com/forums/forum.jspa?forumID=378&lt;/a&gt;&lt;/strong&gt; if I use this form: &lt;strong&gt;&lt;a href=&#34;http://forums.oracle.com/forums/adfAuthentication?success&#34;&gt;http://forums.oracle.com/forums/adfAuthentication?success&lt;/a&gt;_url=https://rmoff.net/forum.jspa?forumID=378&lt;/strong&gt; then I end up signed in more often, and if I&amp;rsquo;ve been signed out then it goes straight to the login page and then through to the forum.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Unix script to report on OBIEE and OBIA processes state</title>
      <link>https://rmoff.net/2009/08/14/unix-script-to-report-on-obiee-and-obia-processes-state/</link>
      <pubDate>Fri, 14 Aug 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/08/14/unix-script-to-report-on-obiee-and-obia-processes-state/</guid>
      <description>&lt;p&gt;Here&amp;rsquo;s a set of scripts that I use on our servers as a quick way to check if the various BI components are up and running.&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/rnm1978/areservicesrunning2.png&#34; alt=&#34;areservicesrunning&#34; title=&#34;areservicesrunning&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;Because we split the stack across servers, there are different scripts called in combination. On our dev boxes we have everything and so the script calls all three sub-scripts, whereas on Production each server will run one of:&lt;/p&gt;&#xA;&lt;ol&gt;&#xA;&lt;li&gt;BI Server&lt;/li&gt;&#xA;&lt;li&gt;Presentation Server &amp;amp; OAS&lt;/li&gt;&#xA;&lt;li&gt;Informatica &amp;amp; DAC&lt;/li&gt;&#xA;&lt;/ol&gt;&#xA;&lt;p&gt;The scripts source another script called process_check.sh which I based on the common.sh script that comes with OBIEE.&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBIA upgrade 7.9.5 to 7.9.6 - first thoughts</title>
      <link>https://rmoff.net/2009/08/13/obia-upgrade-7-9-5-to-7-9-6-first-thoughts/</link>
      <pubDate>Thu, 13 Aug 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/08/13/obia-upgrade-7-9-5-to-7-9-6-first-thoughts/</guid>
      <description>&lt;p&gt;We&amp;rsquo;re upgrading from OBIA 7.9.5 (Financials - GL) to OBIA 7.9.6. Our reasons are for support (7.9.5 does not support Oracle 11g) and minor functionality additions.&lt;/p&gt;&#xA;&lt;p&gt;Our architecture is: HP-UX 64 bit Itanium (11.31), Oracle 11g (11.1.0.7), separate ETL server, 4x OBIEE servers (2x BI, 2xPS). We have no customisations in the ETL except something for budgets, which is superseded in 7.9.6.&lt;/p&gt;&#xA;&lt;p&gt;This post is a semi-formed articulation of my frustrations encountered during an initial run through of the upgrade in a sandbox. As we progress with the upgrade I will post further, hopefully more useful, information on what we encounter.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Repository Error ([REP_51821] Failed to connect from Integration Service (pmserver) to repository Oracle_BI_DW_Base running in exclusive mode.)</title>
      <link>https://rmoff.net/2009/08/10/repository-error-rep_51821-failed-to-connect-from-integration-service-pmserver-to-repository-oracle_bi_dw_base-running-in-exclusive-mode/</link>
      <pubDate>Mon, 10 Aug 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/08/10/repository-error-rep_51821-failed-to-connect-from-integration-service-pmserver-to-repository-oracle_bi_dw_base-running-in-exclusive-mode/</guid>
      <description>&lt;p&gt;I keep hitting this error when setting up OBIA. I suppose it&amp;rsquo;s what it says on the tin, but Googling it didn&amp;rsquo;t match so I&amp;rsquo;m posting this so next time I hit it I remember :-)&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;Repository Error ([REP_51821] Failed to connect from Integration Service (pmserver) to repository Oracle_BI_DW_Base running in exclusive mode.)&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;p&gt;The cause is the Repository Service having OperatingMode set to Exclusive. This is necessary for some of the setup operations like restoring the pre-built Repository, but if you forget to switch it back the Integration Service will suddenly stop working.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Clean install of OAS - Enterprise Manager not available</title>
      <link>https://rmoff.net/2009/08/06/clean-install-of-oas-enterprise-manager-not-available/</link>
      <pubDate>Thu, 06 Aug 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/08/06/clean-install-of-oas-enterprise-manager-not-available/</guid>
      <description>&lt;p&gt;I successfully installed OAS 10.1.3.3 and patched to 10.1.3.4. http://localhost:7777 gave the OAS welcome page, but going to http://localhost:7777/em gave 404 Not Found.&lt;/p&gt;&#xA;&lt;p&gt;In [OASHome]/j2ee/home/config/servers.xml search for ascontrol, you should get:&lt;/p&gt;&#xA;&lt;p&gt;&lt;code&gt;&amp;lt;application name=&#34;ascontrol&#34; path=&#34;../../home/applications/ascontrol.ear&#34; parent=&#34;system&#34; start=&#34;false&#34; /&amp;gt;&lt;/code&gt;&lt;/p&gt;&#xA;&lt;p&gt;Change the start attribute to true:&lt;/p&gt;&#xA;&lt;p&gt;&lt;code&gt;&amp;lt;application name=&#34;ascontrol&#34; path=&#34;../../home/applications/ascontrol.ear&#34; parent=&#34;system&#34; start=&#34;true&#34; /&amp;gt;&lt;/code&gt;&lt;/p&gt;&#xA;&lt;p&gt;Restart OAS ([OAShome]/opmn/bin/opmnctl restartproc) and Enterprise Manager should now be available.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Google vs Bing</title>
      <link>https://rmoff.net/2009/08/05/google-vs-bing/</link>
      <pubDate>Wed, 05 Aug 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/08/05/google-vs-bing/</guid>
      <description>&lt;p&gt;There&amp;rsquo;s been a bit of hype about Bing recently, so I thought I&amp;rsquo;d try it out in trying to get to the bottom of &lt;a href=&#34;http://forums.oracle.com/forums/thread.jspa?messageID=3668647&#34;&gt;this question&lt;/a&gt; on the OBIEE forum.&lt;/p&gt;&#xA;&lt;p&gt;The question was around the nqschangepassword utility and the error it&amp;rsquo;s reporting: &lt;strong&gt;nQSError: 46090 The odbc.ini file could not found or could not be accessed.&lt;/strong&gt;&lt;/p&gt;&#xA;&lt;p&gt;I did a google for the error to see what other issues could cause the error. &lt;a href=&#34;http://www.google.co.uk/search?q=nQSError+46090&#34;&gt;Google showed up&lt;/a&gt; the forum posting in question, plus the Siebel 794 message reference PDF (no v10 reference guide yet :( ) (in blogging this I&amp;rsquo;ll now probably show up on the list too!)&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBIA grumble</title>
      <link>https://rmoff.net/2009/08/04/obia-grumble/</link>
      <pubDate>Tue, 04 Aug 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/08/04/obia-grumble/</guid>
      <description>&lt;p&gt;I&amp;rsquo;m starting on an upgrade from OBIA 7.9.5 to 7.9.6 and wading through the two main docs:&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://download.oracle.com/docs/cd/E14223_01/bia.796/e14218.pdf&#34;&gt;7.9.6 Upgrade guide&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://download.oracle.com/docs/cd/E14223_01/bia.796/e14217.pdf&#34;&gt;7.9.6 Installation guide&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;p&gt;It would be nice if Oracle could come up with some less confusing terminology. It seems that not only is the whole product of OBIA referred to as OBIA (&lt;a href=&#34;http://siebel-essentials.blogspot.com/2009/06/can-you-describe-oracle-bi-applications.html&#34;&gt;see @lex&amp;rsquo;s posting for a good explanation&lt;/a&gt;), but the sub-components which are not-OBIEE-or-DAC-or-Informatica are also OBIA, c.f. page 6-1 of the Upgrade guide &amp;ldquo;[&amp;hellip;]upgrade your Oracle BI Applications environment to the current version.&amp;rdquo; To me that implies that once I&amp;rsquo;ve done this, my OBIA will be upgraded - but no, actually, some of the supporting bits will be upgraded, but I still have to do a heck of a lot more grunt work before what I consider OBIA (and the manual is called OBIA too!) is upgraded.&lt;/p&gt;</description>
    </item>
    <item>
      <title>What is OBIA...</title>
      <link>https://rmoff.net/2009/07/30/what-is-obia/</link>
      <pubDate>Thu, 30 Jul 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/07/30/what-is-obia/</guid>
      <description>&lt;p&gt;Very good post by @lex giving a nice clear explanation of what OBIA (Oracle Business Intelligence Applications) &lt;strong&gt;is&lt;/strong&gt;&lt;/p&gt;&#xA;&lt;p&gt;&lt;a href=&#34;http://siebel-essentials.blogspot.com/2009/06/can-you-describe-oracle-bi-applications.html&#34;&gt;http://siebel-essentials.blogspot.com/2009/06/can-you-describe-oracle-bi-applications.html&lt;/a&gt;&lt;/p&gt;&#xA;&lt;p&gt;This should be made a sticky on the &lt;a href=&#34;http://forums.oracle.com/forums/forum.jspa?forumID=410&amp;amp;start=0&#34;&gt;OBIA forum&lt;/a&gt; in my opinion.&lt;/p&gt;&#xA;&lt;p&gt;It&amp;rsquo;s clear from &lt;a href=&#34;http://forums.oracle.com/forums/thread.jspa?threadID=936497&amp;amp;tstart=0&#34;&gt;postings on the forum&lt;/a&gt; that an awful lot of people don&amp;rsquo;t understand what OBIA is or how it sits with OBIEE. I even attended a course last week in which the &lt;strong&gt;Oracle&lt;/strong&gt; trainer stated that OBIEE included ETL and DW schemas, and stuck to this when challenged.&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBIEE performance monitoring and alerting with jManage</title>
      <link>https://rmoff.net/2009/07/29/oracle-bi-management-jmanage/</link>
      <pubDate>Wed, 29 Jul 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/07/29/oracle-bi-management-jmanage/</guid>
      <description>&lt;p&gt;&lt;a href=&#34;https://rmoff.net/2009/07/22/oracle-bi-management-systems-management-mbeans/&#34;&gt;OBIEE&amp;rsquo;s Systems Management&lt;/a&gt; component exposes configuration and performance data through &lt;a href=&#34;http://java.sun.com/j2se/1.5.0/docs/guide/management/overview.html#mbeans&#34;&gt;Java MBeans&lt;/a&gt;. As discussed in other posts, these can be accessed in several different ways:&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://rmoff.net/2009/07/16/jconsole-jmx/&#34;&gt;JConsole&lt;/a&gt; (see also &lt;a href=&#34;https://rmoff.net/2009/07/21/jconsole-jmx-followup/&#34;&gt;here&lt;/a&gt;)&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://blogs.oracle.com/siebelessentials/2008/11/oracle_bi_ee_and_mbeans.html&#34;&gt;oc4j&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;https://rmoff.net/2009/07/24/obiee-windows-perfmon-counters/&#34;&gt;Windows PerfMon&lt;/a&gt; (although I guess this isn&amp;rsquo;t actually using MBeans/JMX?)&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://obiee101.blogspot.com/2009/07/obiee-perfmon-performance-monitor.html&#34;&gt;saw.dll?perfmon&lt;/a&gt;&lt;/li&gt;&#xA;&lt;li&gt;&lt;a href=&#34;http://www.oracle.com/technology/pub/articles/rittman-oem-bipack.html&#34;&gt;BI Management Pack&lt;/a&gt;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;p&gt;Since it&amp;rsquo;s a standard Java technology being used, we can in theory use anything that is designed for monitoring MBeans via JMX. Doing some Googling I discovered jManage.&lt;/p&gt;</description>
    </item>
    <item>
      <title>How to find out what web application server is in use</title>
      <link>https://rmoff.net/2009/07/28/how-to-find-out-what-web-application-server-is-in-use/</link>
      <pubDate>Tue, 28 Jul 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/07/28/how-to-find-out-what-web-application-server-is-in-use/</guid>
      <description>&lt;p&gt;If, for some reason, you need to check what web application server is in use for Presentation Services (as &lt;a href=&#34;http://forums.oracle.com/forums/thread.jspa?messageID=3651833#3651833&#34;&gt;this chap&lt;/a&gt; needed to), you can use an add-in for FireFox called &lt;a href=&#34;https://addons.mozilla.org/en-US/firefox/addon/6647&#34;&gt;HttpFox&lt;/a&gt; to inspect the HTTP headers.&lt;/p&gt;&#xA;&lt;p&gt;1. Install &lt;a href=&#34;https://addons.mozilla.org/en-US/firefox/addon/6647&#34;&gt;HttpFox&lt;/a&gt; (and obviously Firefox if you don&amp;rsquo;t have it already!)&lt;br&gt;&#xA;2. Open the HttpFox window (Tools -&amp;gt; HttpFox -&amp;gt; Toggle HttpFox)&lt;br&gt;&#xA;3. Click the Start button in the HttpFox window&lt;br&gt;&#xA;4. Navigate to your OBIEE home page&lt;br&gt;&#xA;5. Click the Stop button in the HttpFox window&lt;br&gt;&#xA;6. Click on the first entry in the list, URL should be http://yourserver:7777/analytics/saw.dll?Dashboard&lt;br&gt;&#xA;7. In the right-hand pane of the Headers tab you should see Server listed. In this instance, it&amp;rsquo;s Oracle-Application-Server-10g/10.1.3.1.0 Oracle-HTTP-Server&lt;/p&gt;</description>
    </item>
    <item>
      <title>Maker&#39;s Schedule, Manager&#39;s Schedule</title>
      <link>https://rmoff.net/2009/07/28/makers-schedule-managers-schedule/</link>
      <pubDate>Tue, 28 Jul 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/07/28/makers-schedule-managers-schedule/</guid>
      <description>&lt;p&gt;I found this post very interesting: &lt;a href=&#34;http://www.paulgraham.com/makersschedule.html&#34;&gt;Paul Graham : Maker&amp;rsquo;s Schedule, Manager&amp;rsquo;s Schedule&lt;/a&gt; (originally found &lt;a href=&#34;http://rc3.org/2009/07/24/makers-schedule-versus-managers-schedule/&#34;&gt;here&lt;/a&gt;)&lt;/p&gt;&#xA;&lt;p&gt;It was one of those mini-revelations when I found something that greatly resonated and explained an inexplicable frustration I find in the workplace sometimes.&lt;/p&gt;&#xA;&lt;p&gt;I wonder how applicable it can be to a large corporation though, rather than a start-up where it&amp;rsquo;s given that people are allowed to be &lt;a href=&#34;http://www.urbandictionary.com/define.php?term=bolshy&#34;&gt;bolshy&lt;/a&gt; with their meetings? ;-)&lt;/p&gt;</description>
    </item>
    <item>
      <title>ORA-00922: missing or invalid option</title>
      <link>https://rmoff.net/2009/07/27/ora-00922-missing-or-invalid-option/</link>
      <pubDate>Mon, 27 Jul 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/07/27/ora-00922-missing-or-invalid-option/</guid>
      <description>&lt;p&gt;We routinely change Oracle passwords as part of security best-practice, and I keep hitting this error and keep forgetting why! :-)&lt;/p&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-sql&#34; data-lang=&#34;sql&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#008000;font-weight:bold&#34;&gt;ALTER&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color:#008000;font-weight:bold&#34;&gt;USER&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;DAC_REPO&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;IDENTIFIED&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color:#008000;font-weight:bold&#34;&gt;BY&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color:#666&#34;&gt;1&lt;/span&gt;KoBe3RH&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;&lt;span style=&#34;color:#008000;font-weight:bold&#34;&gt;REPLACE&lt;/span&gt;&lt;span style=&#34;color:#bbb&#34;&gt; &lt;/span&gt;YlR94tqp&lt;span style=&#34;color:#bbb&#34;&gt;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;blockquote&gt;&#xA;&lt;p&gt;Error report: SQL Error: ORA-00922: missing or invalid option 00922. 00000 - &amp;ldquo;missing or invalid option&amp;rdquo; *Cause: *Action:&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;p&gt;Someone better qualified than me can explain why but I suspect it&amp;rsquo;s the leading number in the new password. Quoting the passwords then works fine:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Metalink 3 - Do You Mean ... ?</title>
      <link>https://rmoff.net/2009/07/24/metalink-3-do-you-mean/</link>
      <pubDate>Fri, 24 Jul 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/07/24/metalink-3-do-you-mean/</guid>
      <description>&lt;p&gt;One of my little gripes with Metalink is its purporting to be helpful when it&amp;rsquo;s blatantly not. Here&amp;rsquo;s one: &lt;img src=&#34;https://rmoff.net/images/rnm1978/image_lost.png&#34; alt=&#34;&#34;&gt;&lt;/p&gt;&#xA;&lt;p&gt;Now which is more likely, on Metalink 3; that I&amp;rsquo;m searching for sawserver (integral component to OBIEE), or sqlserver?!&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBIEE Windows PerfMon counters</title>
      <link>https://rmoff.net/2009/07/24/obiee-windows-perfmon-counters/</link>
      <pubDate>Fri, 24 Jul 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/07/24/obiee-windows-perfmon-counters/</guid>
      <description>&lt;p&gt;Yet another way to access the BI Management data discussed &lt;a href=&#34;https://rmoff.net/2009/07/21/obiee-admin-tools-hacks/&#34;&gt;here&lt;/a&gt; - through Windows&amp;rsquo; PerfMon tool.&lt;/p&gt;&#xA;&lt;p&gt;This will only work for installations where your OBIEE server is running on Windows. You should be able to run PerfMon locally or remotely. Standard practice would be not to run it locally on a Production machine :-)&lt;/p&gt;&#xA;&lt;p&gt;To run PerfMon go to Start-&amp;gt;Run and enter perfmon, or navigate Start -&amp;gt; Settings -&amp;gt; Control Panel -&amp;gt; Administrative Tools -&amp;gt; Performance&lt;/p&gt;</description>
    </item>
    <item>
      <title>Mark Rittman&#39;s OBIEE repository for DAC</title>
      <link>https://rmoff.net/2009/07/23/mark-rittmans-obiee-repository-for-dac/</link>
      <pubDate>Thu, 23 Jul 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/07/23/mark-rittmans-obiee-repository-for-dac/</guid>
      <description>&lt;p&gt;&lt;a href=&#34;http://www.rittmanmead.com/2009/01/30/analyzing-bi-apps-etl-runs-using-obiee-and-the-dac-repository/&#34;&gt;Mark Rittman has an excellent article&lt;/a&gt; about querying the DAC repository database tables, including a &lt;a href=&#34;http://www.rittmanmead.com/files/DAC%20Analysis.rpd&#34;&gt;downloadable RPD file&lt;/a&gt;. Being new to working with RPDs I thought it would be good practice to explore this as well as hopefully get some useful information about our current ETL deployment.&lt;/p&gt;&#xA;&lt;p&gt;I downloaded the RPD to c:\OracleBI\server\Repository and opened it up in the Admin tool (Administrator/Administrator).&lt;br&gt;&#xA;First off I changed the connection pool to point to my DAC repository database, having setup a TNS entry for it first.&lt;/p&gt;</description>
    </item>
    <item>
      <title>psservice - Windows command line goodness!</title>
      <link>https://rmoff.net/2009/07/23/psservice-windows-command-line-goodness/</link>
      <pubDate>Thu, 23 Jul 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/07/23/psservice-windows-command-line-goodness/</guid>
      <description>&lt;p&gt;Our main servers are Unix and I&amp;rsquo;m as happy as a pig in muck at the command line, so when I&amp;rsquo;m working on Windows (where I&amp;rsquo;ve got a test OBIEE install) I like to stick with the CLI where possible.&lt;/p&gt;&#xA;&lt;p&gt;&lt;a href=&#34;http://technet.microsoft.com/en-us/sysinternals/bb897542.aspx&#34;&gt;PSService&lt;/a&gt; is one of those tools that I instinctively reach for without realising it. Combined with &lt;a href=&#34;http://www.launchy.net/&#34;&gt;Launchy&lt;/a&gt;, it&amp;rsquo;s even better.&lt;/p&gt;&#xA;&lt;p&gt;Simply put, you can control Windows services from the command line.&lt;/p&gt;</description>
    </item>
    <item>
      <title>sawserver charts crash</title>
      <link>https://rmoff.net/2009/07/23/sawserver-charts-crash/</link>
      <pubDate>Thu, 23 Jul 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/07/23/sawserver-charts-crash/</guid>
      <description>&lt;p&gt;By a strange co-incidence after following &lt;a href=&#34;http://forums.oracle.com/forums/thread.jspa?threadID=931547&amp;amp;tstart=0&#34;&gt;this thread on OTN forums&lt;/a&gt; about a BI crash and struggling to understand the actual problem, I think I&amp;rsquo;ve encountered it myself!&lt;/p&gt;&#xA;&lt;p&gt;I&amp;rsquo;ve got a test install of OBIEE running on my Windows XP laptop, and whilst building a report in Answers got this:&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/rnm1978/image_lost.png&#34; alt=&#34;&#34;&gt; The crash detail was:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;szAppName : sawserver.exe szAppVer : 10.1.3.4 szModName : kernel32.dll szModVer : 5.1.2600.3119 offset : 000097a3&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;p&gt;Going to the sawserver log at c:\OracleBIData\web\log\sawlog0.log disappointingly showed no error entries :(&lt;/p&gt;</description>
    </item>
    <item>
      <title>sawserver logging configuration - logconfig.xml</title>
      <link>https://rmoff.net/2009/07/23/sawserver-logging-configuration-logconfig-xml/</link>
      <pubDate>Thu, 23 Jul 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/07/23/sawserver-logging-configuration-logconfig-xml/</guid>
      <description>&lt;p&gt;The configuration of how Presentation Services (sawserver) does its logging is in the file web/config/logconfig.xml (same directory as instanceconfig.xml).&lt;/p&gt;&#xA;&lt;p&gt;It&amp;rsquo;s all nice and XML&amp;rsquo;d:&lt;/p&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/rnm1978/image_lost.png&#34; alt=&#34;&#34;&gt;Logging Detail&lt;br&gt;&#xA;Change the numerical values in the FilterRecord entries to alter the detail level of the logging. Lower means less detail, higher means more.&lt;/p&gt;&#xA;&lt;p&gt;Be aware that your log files can grow very rapidly if you set the logging too high, and unless you&amp;rsquo;re troubleshooting then leave them at the defaults.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Oracle BI Management / Systems Management MBeans</title>
      <link>https://rmoff.net/2009/07/22/oracle-bi-management-systems-management-mbeans/</link>
      <pubDate>Wed, 22 Jul 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/07/22/oracle-bi-management-systems-management-mbeans/</guid>
      <description>&lt;p&gt;Part of &lt;a href=&#34;https://rmoff.net/2009/07/21/obiee-admin-tools-hacks/&#34;&gt;looking at the various gubbins inside OBIEE&lt;/a&gt; led me to realise that the Oracle BI Management application drives quite a few things. It exposes MBeans (Management Beans, a java term), accessible through &lt;a href=&#34;http://en.wikipedia.org/wiki/JMX&#34;&gt;jmx&lt;/a&gt;. In the installation of OBIEE this component is referred to as &amp;ldquo;Systems Management&amp;rdquo;.&lt;/p&gt;&#xA;&lt;p&gt;The MBeans give us real-time performance information, along with access to all the configuration options that are normally done through config files (instanceconfig.xml etc). Bear in mind if using it for updating configuration instead of through the files you don&amp;rsquo;t get any backup created, so for that reason alone I would suggest it should only be used for reading current values.&lt;/p&gt;</description>
    </item>
    <item>
      <title>JConsole / JMX - followup</title>
      <link>https://rmoff.net/2009/07/21/jconsole-jmx-followup/</link>
      <pubDate>Tue, 21 Jul 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/07/21/jconsole-jmx-followup/</guid>
      <description>&lt;p&gt;A few points to add to my &lt;a href=&#34;https://rmoff.net/2009/07/16/jconsole-jmx/&#34;&gt;previous posting on JConsole&lt;/a&gt;:&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;As well as performance data, you have access to configuration data. Be aware that it is read-write! So whilst it might be a nice alternative to digging around for your instanceconfig.xml etc, you should be careful&lt;/li&gt;&#xA;&lt;li&gt;If you have your BI Server and Presentation Services deployed on separate servers then you will only get MBeans for the relevant service:&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/rnm1978/image_lost.png&#34; alt=&#34;&#34;&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBIEE admin tools &amp; hacks</title>
      <link>https://rmoff.net/2009/07/21/obiee-admin-tools-hacks/</link>
      <pubDate>Tue, 21 Jul 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/07/21/obiee-admin-tools-hacks/</guid>
      <description>&lt;p&gt;As a kid I loved the idea of lego where you can disassemble and reassemble something from the ground up. As soon as I got my hands on a computer it was the same. You can have your Acorn Archimedes with its games, where do I find the sprites and sound files behind it? Likewise Microsoft Word, let me at the VBA underneath to hack it around and see what else it can do.&lt;/p&gt;</description>
    </item>
    <item>
      <title>OTN forum rant</title>
      <link>https://rmoff.net/2009/07/21/otn-forum-rant/</link>
      <pubDate>Tue, 21 Jul 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/07/21/otn-forum-rant/</guid>
      <description>&lt;p&gt;I read and post a bit on the &lt;a href=&#34;http://forums.oracle.com/forums/forum.jspa?forumID=378&#34;&gt;OBIEE&lt;/a&gt; and &lt;a href=&#34;http://forums.oracle.com/forums/forum.jspa?forumID=410&amp;amp;start=0&#34;&gt;OBIA&lt;/a&gt; OTN forums. The noise ratio isn&amp;rsquo;t too bad, but a few things really get my goat:&lt;/p&gt;&#xA;&lt;ol&gt;&#xA;&lt;li&gt;&#xA;&lt;p&gt;Not responding to answers!&lt;br&gt;&#xA;If I&amp;rsquo;ve gone out of my way to help, or try to help, at least have the courtesy to acknowledge it, and ideally mark as Helpful or Correct as appropriate.&lt;br&gt;&#xA;Even a simple &amp;ldquo;thanks.&amp;rdquo; would do. It&amp;rsquo;s just good manners.&lt;br&gt;&#xA;It also helps people who come along afterwards as you get the completed picture rather than half a question/answer thread.&lt;br&gt;&#xA;&lt;a href=&#34;http://forums.oracle.com/forums/ann.jspa?annID=939&#34;&gt;Forums Etiquette&lt;/a&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>JConsole / JMX</title>
      <link>https://rmoff.net/2009/07/16/jconsole-jmx/</link>
      <pubDate>Thu, 16 Jul 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/07/16/jconsole-jmx/</guid>
      <description>&lt;p&gt;[edit] See &lt;a href=&#34;https://rmoff.net/2009/07/21/jconsole-jmx-followup/&#34;&gt;this post&lt;/a&gt; too [/edit] On an OBIEE server run&lt;/p&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;nohup obiee/systemsmanagement/runagent.sh &amp;amp;&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;and then run &lt;strong&gt;jconsole&lt;/strong&gt; (make sure you&amp;rsquo;ve set the DISPLAY first if you&amp;rsquo;re running it from UNIX). NB: if you don&amp;rsquo;t have jconsole in your path you can search for it:&lt;/p&gt;&#xA;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; style=&#34;;-moz-tab-size:4;-o-tab-size:4;tab-size:4;-webkit-text-size-adjust:none;&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;&lt;span style=&#34;color:#19177c&#34;&gt;$whereis&lt;/span&gt; jconsole&#xA;&lt;/span&gt;&lt;/span&gt;&lt;span style=&#34;display:flex;&#34;&gt;&lt;span&gt;jconsole: /opt/java1.5/bin/jconsole /opt/java6/bin/jconsole&#xA;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;You should find it under your java/bin directory.&lt;/p&gt;&#xA;&lt;p&gt;You should get this kind of connection dialog: &lt;a href=&#34;https://rmoff.net/images/2009/07/jconsole_connect.webp&#34;&gt;&lt;img src=&#34;https://rmoff.net/images/2009/07/jconsole_connect.webp&#34; alt=&#34;&#34;&gt;&lt;/a&gt; Click connect, and the console will launch.
From here click on the MBeans tab, where you&amp;rsquo;ve got access to performance and configuration data &lt;a href=&#34;https://rmoff.net/images/2009/07/jconsole.webp&#34;&gt;&lt;img src=&#34;https://rmoff.net/images/2009/07/jconsole.webp&#34; alt=&#34;&#34;&gt;&lt;/a&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>nqcmd and [nQSError: 27005] Unresolved column</title>
      <link>https://rmoff.net/2009/05/28/nqcmd-and-nqserror-27005-unresolved-column/</link>
      <pubDate>Thu, 28 May 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/05/28/nqcmd-and-nqserror-27005-unresolved-column/</guid>
      <description>&lt;p&gt;I&amp;rsquo;m working on a scripted load test for OBIEE using nqcmd to run reports multiple times. I hit this interesting issue.&lt;/p&gt;&#xA;&lt;p&gt;Cut and pasting the logical SQL that Presentation Services sends to BI Server from Manage Sessions -&amp;gt; Statement, I kept getting this error when I ran it through nqcmd:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;[10058][State: S1000] [NQODBC] [SQL_STATE: S1000] [nQSError: 10058] A general error has occurred.&lt;br&gt;&#xA;[nQSError: 27005] Unresolved column: &amp;ldquo;Natural Account (COA)&amp;rdquo;.&amp;ldquo;Account Parent1 Code&amp;rdquo;.&lt;br&gt;&#xA;Statement preparation failed&lt;/p&gt;</description>
    </item>
    <item>
      <title>Custom HTTP error page in OBIEE / OAS</title>
      <link>https://rmoff.net/2009/05/18/custom-http-error-page-in-obiee-oas/</link>
      <pubDate>Mon, 18 May 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/05/18/custom-http-error-page-in-obiee-oas/</guid>
      <description>&lt;p&gt;It&amp;rsquo;s possible to change the error pages served up by OAS/Apache by using the ErrorDocument directive. This is &lt;a href=&#34;http://httpd.apache.org/docs/1.3/mod/core.html#errordocument&#34;&gt;widely documented&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;p&gt;However, to get this to take effect in an oc4j application (such as analytics) you need to change mod_oc4j.conf too.&lt;br&gt;&#xA;(I found this out from this post &lt;a href=&#34;http://jhelvoort.wordpress.com/2009/01/13/errordocument-fails-to-intercept-internal-500-error/&#34;&gt;here&lt;/a&gt;)&lt;/p&gt;&#xA;&lt;p&gt;Take backups of httpd.conf and mod_oc4j.conf, and then edit them as follows:&lt;/p&gt;&#xA;&lt;p&gt;In httpd.conf add:&lt;br&gt;&#xA;ErrorDocument 500 /500.html&lt;br&gt;&#xA;where /500.html is a relative path to your custom document&lt;/p&gt;</description>
    </item>
    <item>
      <title>New releases</title>
      <link>https://rmoff.net/2009/04/28/new-releases/</link>
      <pubDate>Tue, 28 Apr 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/04/28/new-releases/</guid>
      <description>&lt;p&gt;New releases this week - 10.1.3.4.1 of OBIEE, and 7.9.6 of OBIA.&lt;/p&gt;&#xA;&lt;p&gt;After a bit of scrabbling around I found:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;&lt;a href=&#34;http://download.oracle.com/docs/cd/E10415_01/doc/bi.1013/e10416/general_101341.htm#BABCFCIA&#34;&gt;&amp;ldquo;The 10.1.3.4.1 release of the Oracle Business Intelligence Enterprise Edition introduces no new features.&amp;rdquo;&lt;/a&gt;&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;p&gt;Though there are some &lt;a href=&#34;http://download.oracle.com/docs/cd/E12844_01/doc/bip.1013/e14667/toc.htm&#34;&gt;new bits and pieces for Publisher&lt;/a&gt;&lt;/p&gt;&#xA;&lt;p&gt;For 7.9.6 OBIA, there&amp;rsquo;s no New Features document :(&lt;br&gt;&#xA;By piecing together the &amp;ldquo;What&amp;rsquo;s New in This Release&amp;rdquo; for each document you can build up a picture (eg in the &lt;a href=&#34;http://download.oracle.com/docs/cd/E14223_01/bia.796/e14218/toc.htm&#34;&gt;Upgrade Guide&lt;/a&gt; there&amp;rsquo;s reference to changes for the doc relating to a new version of Informatica), but it would be nice to have it all in one place:&lt;/p&gt;</description>
    </item>
    <item>
      <title>OBIEE and F5 BIG-IP</title>
      <link>https://rmoff.net/2009/04/15/obiee-and-f5-big-ip/</link>
      <pubDate>Wed, 15 Apr 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/04/15/obiee-and-f5-big-ip/</guid>
      <description>&lt;p&gt;We&amp;rsquo;ve got a setup of two OAS/Presentation Services boxes and two BI Server boxes, with load balancing/failover throughout.&lt;br&gt;&#xA;The Load Balancing of the web requests is being done through a separate bit of kit, an F5 BIG-IP load balancer. This directs the requests at the two OAS servers.&lt;/p&gt;&#xA;&lt;p&gt;The problem we have is that by default OAS serves HTTP on port 7777, but the F5 is using port 80. A request for our load balanced URL: &lt;a href=&#34;http://bi.mycompany.com/analytics/&#34;&gt;http://bi.mycompany.com/analytics/&lt;/a&gt; barfs out with&lt;/p&gt;</description>
    </item>
    <item>
      <title>Google and Korean OTN forums</title>
      <link>https://rmoff.net/2009/04/02/google-and-korean-otn-forums/</link>
      <pubDate>Thu, 02 Apr 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/04/02/google-and-korean-otn-forums/</guid>
      <description>&lt;p&gt;Why does Google more often than not return the Korean (kr) version of OTN forums for a match, but not English?&lt;/p&gt;&#xA;&lt;p&gt;If I &lt;a href=&#34;http://www.google.co.uk/search?hl=en&amp;amp;client=firefox-a&amp;amp;rls=org.mozilla%3Aen-GB%3Aofficial&amp;amp;hs=14e&amp;amp;num=30&amp;amp;q=%22Send+notification%3A%22+%22oracle.ons.Notification%22&amp;amp;btnG=Search&amp;amp;meta=&#34;&gt;search Google for &amp;ldquo;Send notification:&amp;rdquo; &amp;ldquo;oracle.ons.Notification&amp;rdquo;&lt;/a&gt; the top hit is for &lt;a href=&#34;http://kr.forums.oracle.com/forums/thread.jspa?threadID=650662&#34;&gt;kr.forums.oracle.com/forums/thread.jspa?threadID=650662&lt;/a&gt;&lt;/p&gt;&#xA;&lt;p&gt;If you strip the kr from the URL you get &lt;a href=&#34;http://forums.oracle.com/forums/thread.jspa?threadID=650662&#34;&gt;this&lt;/a&gt;, which makes more sense given the locality when doing the search in Google.&lt;/p&gt;&#xA;&lt;p&gt;Only mildly irritating, but odd behaviour nonetheless.&lt;/p&gt;</description>
    </item>
    <item>
      <title>OAS bug 7132128 - Send notification: oracle.ons.Notification</title>
      <link>https://rmoff.net/2009/04/02/oas-bug-7132128-send-notification-oracle-ons-notification/</link>
      <pubDate>Thu, 02 Apr 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/04/02/oas-bug-7132128-send-notification-oracle-ons-notification/</guid>
      <description>&lt;p&gt;I noticed that the j2ee server.log file was filling up with these entries:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;oracle.ons.Notification@afba5d&lt;br&gt;&#xA;09/04/02 10:15:12.207 Send notification:&lt;br&gt;&#xA;oracle.ons.Notification@17ca0f5&lt;br&gt;&#xA;09/04/02 10:15:42.217 Send notification:&lt;br&gt;&#xA;oracle.ons.Notification@1a28842&lt;br&gt;&#xA;09/04/02 10:16:12.227 Send notification:&lt;br&gt;&#xA;oracle.ons.Notification@5144d5&lt;br&gt;&#xA;09/04/02 10:16:42.237 Send notification:&lt;br&gt;&#xA;oracle.ons.Notification@19078ed&lt;br&gt;&#xA;09/04/02 10:17:12.247 Send notification:&lt;br&gt;&#xA;oracle.ons.Notification@fcc268&lt;br&gt;&#xA;09/04/02 10:17:42.257 Send notification:&lt;br&gt;&#xA;oracle.ons.Notification@16df388&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;p&gt;A quick google turned up &lt;a href=&#34;http://forums.oracle.com/forums/thread.jspa?messageID=2550518&amp;amp;#2550518&#34;&gt;this page&lt;/a&gt; in which bug 7132128 (&amp;ldquo;OC4J EMITS &amp;ldquo;SEND NOTIFICATION&amp;rdquo; MESSAGES THAT FILL SERVER.LOG&amp;rdquo;) is identified.&lt;/p&gt;&#xA;&lt;p&gt;The server.log is currently only 6MB and given the size of the server it will be a while before it causes us problems, but it&amp;rsquo;s worth being aware of.&lt;/p&gt;</description>
    </item>
    <item>
      <title>OAS makes you log in twice</title>
      <link>https://rmoff.net/2009/04/02/oas-makes-you-log-in-twice/</link>
      <pubDate>Thu, 02 Apr 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/04/02/oas-makes-you-log-in-twice/</guid>
      <description>&lt;p&gt;A very minor irritation, but an irritation nonetheless, is that when I go to Application Server Control in OAS I have to log in twice.&lt;/p&gt;&#xA;&lt;p&gt;Reading around, I found this is an Apache feature, and is actually intended behaviour.&lt;/p&gt;&#xA;&lt;p&gt;For reasons I&amp;rsquo;ve not explored, our servers have several different hostnames which resolve to the same IP, e.g.:&lt;br&gt;&#xA;myserver&lt;br&gt;&#xA;myserver-app&lt;br&gt;&#xA;myserver-data&lt;/p&gt;&#xA;&lt;p&gt;When you request a page from Apache using a hostname other than that configured as ServerName in Apache&amp;rsquo;s httpd.conf, it redirects you to the version of the page using the ServerName.&lt;/p&gt;</description>
    </item>
    <item>
      <title>sawserver won&#39;t start up - resolved</title>
      <link>https://rmoff.net/2009/04/01/sawserver-wont-start-up-resolved/</link>
      <pubDate>Wed, 01 Apr 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/04/01/sawserver-wont-start-up-resolved/</guid>
      <description>&lt;p&gt;(See &lt;a href=&#34;https://rmoff.net/2009/03/30/sawserver-wont-start-analytics-servlet-error-java-net-connectexception-connection-refused-errno239/&#34;&gt;here&lt;/a&gt; and &lt;a href=&#34;https://rmoff.net/2009/04/01/troubleshooting-an-hpux-program/&#34;&gt;here&lt;/a&gt; for history)&lt;/p&gt;&#xA;&lt;p&gt;I edited the shell script which is eventually called by run-saw.sh to start the sawserver, (OracleBI)/setup/sawserver.sh, to use tusc:&lt;br&gt;&#xA;Comment out the final line:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;$SASAWSERVER&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;p&gt;and insert as a new line:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;tusc -fepan -o /tmp/sawserver_tusc.out $SASAWSERVER&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;p&gt;The output of tusc ended with this:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;open(&amp;quot;/app/oracle/product/10.2.0/lib/libstd_v2.so.1&amp;quot;, O_RDONLY|0x800, 0) &amp;hellip;&amp;hellip;&amp;hellip; ERR#2 ENOENT&lt;br&gt;&#xA;open(&amp;quot;/app/oracle/product/obiee/server/Bin64/libstd_v2.so.1&amp;quot;, O_RDONLY|0x800, 0) . ERR#2 ENOENT&lt;br&gt;&#xA;open(&amp;quot;/app/oracle/product/obiee/web/bin64/libstd_v2.so.1&amp;quot;, O_RDONLY|0x800, 0) &amp;hellip;. ERR#2 ENOENT&lt;br&gt;&#xA;open(&amp;quot;/app/oracle/product/obiee/odbc/lib64/libstd_v2.so.1&amp;quot;, O_RDONLY|0x800, 0) &amp;hellip; ERR#2 ENOENT&lt;br&gt;&#xA;open(&amp;quot;/usr/lib/libstd_v2.so.1&amp;quot;, O_RDONLY|0x800, 0) &amp;hellip;&amp;hellip;&amp;hellip;&amp;hellip;&amp;hellip;&amp;hellip;&amp;hellip;&amp;hellip;&amp;hellip;&amp;hellip;. ERR#2 ENOENT&lt;br&gt;&#xA;open(&amp;quot;/lib/libstd_v2.so.1&amp;quot;, O_RDONLY|0x800, 0) &amp;hellip;&amp;hellip;&amp;hellip;&amp;hellip;&amp;hellip;&amp;hellip;&amp;hellip;&amp;hellip;&amp;hellip;&amp;hellip;&amp;hellip;.. 
ERR#2 ENOENT&lt;br&gt;&#xA;open(&amp;quot;/app/oracle/product/10.2.0/lib/libstd_v2.so.1&amp;quot;, O_RDONLY|0x800, 0) &amp;hellip;&amp;hellip;&amp;hellip; ERR#2 ENOENT&lt;br&gt;&#xA;open(&amp;quot;/opt/aCC/lib/hpux64/libstd_v2.so.1&amp;quot;, O_RDONLY|0x800, 0) &amp;hellip;&amp;hellip;&amp;hellip;&amp;hellip;&amp;hellip;&amp;hellip;.. ERR#2 ENOENT&lt;/p&gt;</description>
    </item>
    <item>
      <title>Troubleshooting an HPUX program</title>
      <link>https://rmoff.net/2009/04/01/troubleshooting-an-hpux-program/</link>
      <pubDate>Wed, 01 Apr 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/04/01/troubleshooting-an-hpux-program/</guid>
      <description>&lt;p&gt;In investigating the &lt;a href=&#34;https://rmoff.net/2009/03/30/sawserver-wont-start-analytics-servlet-error-java-net-connectexception-connection-refused-errno239/&#34;&gt;problems with sawserver&lt;/a&gt; I was pointed towards a tool called &lt;a href=&#34;http://hpux.connect.org.uk/hppd/hpux/Sysadmin/tusc-7.9/&#34;&gt;tusc&lt;/a&gt; (which appears to be an HP version of truss).&lt;/p&gt;&#xA;&lt;p&gt;You can use it to invoke a program, and get out a bunch of debug information including system calls.&lt;/p&gt;&#xA;&lt;p&gt;You run it like this:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;$tusc -fep /app/oracle/product/obiee/web/bin64/sawserver64&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;p&gt;As a beginner when it comes to hardcore *nix I can only look at this and take pot shots at what&amp;rsquo;s going on, but with Google by my side I&amp;rsquo;m interested in the last lines of the output:&lt;/p&gt;</description>
    </item>
    <item>
      <title>Bug in Clustered Publisher Scheduler - ClusterManager: detected 1 failed or restarted instances</title>
      <link>https://rmoff.net/2009/03/30/bug-in-clustered-publisher-scheduler-clustermanager-detected-1-failed-or-restarted-instances/</link>
      <pubDate>Mon, 30 Mar 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/03/30/bug-in-clustered-publisher-scheduler-clustermanager-detected-1-failed-or-restarted-instances/</guid>
      <description>&lt;p&gt;Follow on from &lt;a href=&#34;https://rmoff.net/2009/03/24/clustering-publisher-scheduler-and-report-repository/&#34;&gt;setting up Publisher in a clustered environment&lt;/a&gt;, I&amp;rsquo;ve found a nasty little bug in the scheduling element of Publisher, Quartz.&lt;/p&gt;&#xA;&lt;p&gt;Looking at the oc4j log file /opmn/logs/default_group&lt;del&gt;home&lt;/del&gt;default_group~1.log I can see OC4J starting up, and then a whole load of repeated messages:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;09/03/30 11:28:43 Oracle Containers for J2EE 10g (10.1.3.3.0) initialized&lt;br&gt;&#xA;- ClusterManager: detected 1 failed or restarted instances.&lt;br&gt;&#xA;- ClusterManager: Scanning for instance &amp;ldquo;myserver.fqdn.company.net1238408921404&amp;rdquo;&amp;rsquo;s failed in-progress jobs.&lt;br&gt;&#xA;- ClusterManager: detected 1 failed or restarted instances.&lt;br&gt;&#xA;- ClusterManager: Scanning for instance &amp;ldquo;myserver.fqdn.company.net1238408921404&amp;rdquo;&amp;rsquo;s failed in-progress jobs.&lt;br&gt;&#xA;- ClusterManager: detected 1 failed or restarted instances.&lt;br&gt;&#xA;- ClusterManager: Scanning for instance &amp;ldquo;myserver.fqdn.company.net1238408921404&amp;rdquo;&amp;rsquo;s failed in-progress jobs.&lt;br&gt;&#xA;- ClusterManager: detected 1 failed or restarted instances.&lt;br&gt;&#xA;- ClusterManager: Scanning for instance &amp;ldquo;myserver.fqdn.company.net1238408921404&amp;rdquo;&amp;rsquo;s failed in-progress jobs.&lt;br&gt;&#xA;- ClusterManager: detected 1 failed or restarted instances.&lt;br&gt;&#xA;- ClusterManager: Scanning for instance &amp;ldquo;myserver.fqdn.company.net1238408921404&amp;rdquo;&amp;rsquo;s failed in-progress jobs.&lt;br&gt;&#xA;- ClusterManager: detected 1 failed or restarted instances.&lt;br&gt;&#xA;- ClusterManager: Scanning for instance &amp;ldquo;myserver.fqdn.company.net1238408921404&amp;rdquo;&amp;rsquo;s failed in-progress jobs.&lt;br&gt;&#xA;- ClusterManager: 
detected 1 failed or restarted instances.&lt;br&gt;&#xA;- ClusterManager: Scanning for instance &amp;ldquo;myserver.fqdn.company.net1238408921404&amp;rdquo;&amp;rsquo;s failed in-progress jobs.&lt;br&gt;&#xA;- ClusterManager: detected 1 failed or restarted instances.&lt;br&gt;&#xA;- ClusterManager: Scanning for instance &amp;ldquo;myserver.fqdn.company.net1238408921404&amp;rdquo;&amp;rsquo;s failed in-progress jobs.&lt;br&gt;&#xA;[&amp;hellip; repeated for 38MB worth ]&lt;/p&gt;</description>
    </item>
    <item>
      <title>sawserver won&#39;t start (analytics: Servlet error java.net.ConnectException: Connection refused (errno:239))</title>
      <link>https://rmoff.net/2009/03/30/sawserver-wont-start-analytics-servlet-error-java-net-connectexception-connection-refused-errno239/</link>
      <pubDate>Mon, 30 Mar 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/03/30/sawserver-wont-start-analytics-servlet-error-java-net-connectexception-connection-refused-errno239/</guid>
      <description>&lt;p&gt;We&amp;rsquo;re getting this error in the Presentation Services plug-in [analytics].&lt;br&gt;&#xA;Log file: /j2ee/home/application-deployments/analytics/home_default_group_1/application.log&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;09/03/30 13:16:38.75 analytics: Servlet error&lt;br&gt;&#xA;java.net.ConnectException: Connection refused (errno:239)&lt;br&gt;&#xA;at java.net.PlainSocketImpl.socketConnect(Native Method)&lt;br&gt;&#xA;at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:333)&lt;br&gt;&#xA;at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:195)&lt;br&gt;&#xA;at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:182)&lt;br&gt;&#xA;at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366)&lt;br&gt;&#xA;at java.net.Socket.connect(Socket.java:517)&lt;br&gt;&#xA;at java.net.Socket.connect(Socket.java:467)&lt;br&gt;&#xA;at java.net.Socket.(Socket.java:364)&lt;br&gt;&#xA;at java.net.Socket.(Socket.java:178)&lt;br&gt;&#xA;at com.siebel.analytics.web.sawconnect.ConnectionPoolSocketFactoryImpl.createSocket(ConnectionPoolSocketFactoryImpl.java:63)&lt;br&gt;&#xA;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;br&gt;&#xA;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)&lt;br&gt;&#xA;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)&lt;br&gt;&#xA;at java.lang.reflect.Method.invoke(Method.java:585)&lt;br&gt;&#xA;at com.siebel.analytics.web.sawconnect.ConnectionPoolSocketFactoryImpl.createSocket(ConnectionPoolSocketFactoryImpl.java:70)&lt;br&gt;&#xA;at com.siebel.analytics.web.sawconnect.ConnectionPool.createNewConnection(ConnectionPool.java:314)&lt;br&gt;&#xA;at com.siebel.analytics.web.sawconnect.ConnectionPool.getConnection(ConnectionPool.java:133)&lt;br&gt;&#xA;at com.siebel.analytics.web.SAWBridge.processRequest(SAWBridge.java:299)&lt;br&gt;&#xA;at 
com.siebel.analytics.web.SAWBridge.doGet(SAWBridge.java:325)&lt;br&gt;&#xA;at javax.servlet.http.HttpServlet.service(HttpServlet.java:743)&lt;br&gt;&#xA;at javax.servlet.http.HttpServlet.service(HttpServlet.java:856)&lt;br&gt;&#xA;at com.evermind[Oracle Containers for J2EE 10g (10.1.3.3.0) ].server.http.ServletRequestDispatcher.invoke(ServletRequestDispatcher.java:713)&lt;br&gt;&#xA;at com.evermind[Oracle Containers for J2EE 10g (10.1.3.3.0) ].server.http.ServletRequestDispatcher.forwardInternal(ServletRequestDispatcher.java:3&lt;br&gt;&#xA;70)&lt;br&gt;&#xA;at com.evermind[Oracle Containers for J2EE 10g (10.1.3.3.0) ].server.http.HttpRequestHandler.doProcessRequest(HttpRequestHandler.java:871)&lt;br&gt;&#xA;at com.evermind[Oracle Containers for J2EE 10g (10.1.3.3.0) ].server.http.HttpRequestHandler.processRequest(HttpRequestHandler.java:453)&lt;br&gt;&#xA;at com.evermind[Oracle Containers for J2EE 10g (10.1.3.3.0) ].server.http.AJPRequestHandler.run(AJPRequestHandler.java:302)&lt;br&gt;&#xA;at com.evermind[Oracle Containers for J2EE 10g (10.1.3.3.0) ].server.http.AJPRequestHandler.run(AJPRequestHandler.java:190)&lt;br&gt;&#xA;at oracle.oc4j.network.ServerSocketReadHandler$SafeRunnable.run(ServerSocketReadHandler.java:260)&lt;br&gt;&#xA;at oracle.oc4j.network.ServerSocketAcceptHandler.procClientSocket(ServerSocketAcceptHandler.java:239)&lt;br&gt;&#xA;at oracle.oc4j.network.ServerSocketAcceptHandler.access$700(ServerSocketAcceptHandler.java:34)&lt;br&gt;&#xA;at oracle.oc4j.network.ServerSocketAcceptHandler$AcceptHandlerHorse.run(ServerSocketAcceptHandler.java:880)&lt;br&gt;&#xA;at com.evermind[Oracle Containers for J2EE 10g (10.1.3.3.0) ].util.ReleasableResourcePooledExecutor$MyWorker.run(ReleasableResourcePooledExecutor.&lt;br&gt;&#xA;java:303)&lt;br&gt;&#xA;at java.lang.Thread.run(Thread.java:595)&lt;/p&gt;</description>
    </item>
    <item>
      <title>ODI Server install - missing odiparams.sh file</title>
      <link>https://rmoff.net/2009/03/27/odi-server-install-missing-odiparams-sh-file/</link>
      <pubDate>Fri, 27 Mar 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/03/27/odi-server-install-missing-odiparams-sh-file/</guid>
      <description>&lt;p&gt;I&amp;rsquo;m installing ODI agent on our database server using OUI. I selected the &amp;ldquo;Server&amp;rdquo; option at install time to get the Agent only, but looking in oracledi/bin odiparams.sh is missing:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;$ls -l *.sh&lt;br&gt;&#xA;-rwxrwxrwx   1 odiadm     dba            685 Nov 21 15:58 agent.sh&lt;br&gt;&#xA;-rwxrwxrwx   1 odiadm     dba            908 Nov 21 15:58 agentscheduler.sh&lt;br&gt;&#xA;-rwxrwxrwx   1 odiadm     dba            707 Nov 21 15:58 agentstop.sh&lt;br&gt;&#xA;-rwxrwxrwx   1 odiadm     dba            941 Nov 21 15:58 agentweb.sh&lt;br&gt;&#xA;-rwxrwxrwx   1 odiadm     dba            724 Nov 21 15:58 jython.sh&lt;/p&gt;</description>
    </item>
    <item>
      <title>Remove windows line feed characters in vi</title>
      <link>https://rmoff.net/2009/03/27/remove-windows-line-feed-characters-in-vi/</link>
      <pubDate>Fri, 27 Mar 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/03/27/remove-windows-line-feed-characters-in-vi/</guid>
      <description>&lt;p&gt;If you work with a file in both Windows and Unix, at some point you might end up with Windows line feed characters in your Unix file. It&amp;rsquo;ll look like this:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;one line of text ^M&lt;br&gt;&#xA;next line ^M&lt;br&gt;&#xA;and next line with more ^M&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;p&gt;To remove the ^M character, load the file into vi on Unix and enter as a line command the following:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;:1,$s/^M//&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;p&gt;but instead of typing ^M, press Ctrl-V Ctrl-M to get the characters.&lt;/p&gt;</description>
    </item>
    <item>
      <title>ORA-12537 / ORA-12518 [Informatica DAC error CMN_1022]</title>
      <link>https://rmoff.net/2009/03/25/ora-12537-ora-12518-informatica-dac-error-cmn_1022/</link>
      <pubDate>Wed, 25 Mar 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/03/25/ora-12537-ora-12518-informatica-dac-error-cmn_1022/</guid>
      <description>&lt;p&gt;We&amp;rsquo;re getting problems with an instance of Informatica / out-of-the-box OBIA on a new set of servers. When we run the execution plan we get this error soon after starting:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;MAPPING&amp;gt; DBG_21075 Connecting to database [TNSENTRY], user [MYUSER]&lt;br&gt;&#xA;MAPPING&amp;gt; CMN_1761 Timestamp Event: [Tue Mar 24 18:56:33 2009]&lt;br&gt;&#xA;MAPPING&amp;gt; CMN_1022 Database driver error&amp;hellip;&lt;br&gt;&#xA;CMN_1022 [&lt;br&gt;&#xA;Database driver error&amp;hellip;&lt;br&gt;&#xA;Function Name : Logon&lt;br&gt;&#xA;ORA-12537: TNS:connection closed&lt;/p&gt;&#xA;&lt;p&gt;Database driver error&amp;hellip;&lt;br&gt;&#xA;Function Name : Connect&lt;br&gt;&#xA;Database Error: Failed to connect to database using user [MYUSER] and connection string [TNSENTRY].]&lt;/p&gt;</description>
    </item>
    <item>
      <title>Clustering Publisher - Scheduler and Report Repository</title>
      <link>https://rmoff.net/2009/03/24/clustering-publisher-scheduler-and-report-repository/</link>
      <pubDate>Tue, 24 Mar 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/03/24/clustering-publisher-scheduler-and-report-repository/</guid>
      <description>&lt;p&gt;The &lt;a href=&#34;http://www.oracle.com/technology/products/xml-publisher/docs/BIP_HA.pdf&#34;&gt;Oracle BI Publisher Enterprise Cluster Deployment&lt;/a&gt; doc which I just found through Metalink highlighted a couple of points:&lt;br&gt;&#xA;- Report repository should be shared&lt;br&gt;&#xA;- The scheduler should be configured for a cluster&lt;/p&gt;&#xA;&lt;p&gt;Report Repository&lt;br&gt;&#xA;Through Admin&amp;gt;System Maintenance&amp;gt;Report Repository I changed the path from the default, /xmlp/XMLP, to an NFS mount, data/shared/xmlp, and restarted the xmlpserver application in OAS. On coming back up Publisher complained because all its config files (in xmlp/Admin) had disappeared. I&amp;rsquo;d not moved any of the contents of /xmlp/XMLP since Report Repository suggested to me that it was just for reports, ergo with no reports yet created there was nothing to move.&lt;br&gt;&#xA;So pedantries aside, I moved the contents of /xmlp/XMLP to my new share, data/shared/xmlp. Publisher was happy after this.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Firefox add-ins - ones I find useful</title>
      <link>https://rmoff.net/2009/03/24/firefox-add-ins-ones-i-find-useful/</link>
      <pubDate>Tue, 24 Mar 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/03/24/firefox-add-ins-ones-i-find-useful/</guid>
      <description>&lt;p&gt;&lt;img src=&#34;https://rmoff.net/images/rnm1978/image_lost.png&#34; alt=&#34;&#34;&gt;&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;Delicious Bookmarks&lt;/li&gt;&#xA;&lt;li&gt;FireGestures&lt;/li&gt;&#xA;&lt;li&gt;FoxyProxy&lt;/li&gt;&#xA;&lt;li&gt;Greasemonkey&lt;/li&gt;&#xA;&lt;li&gt;HttpFox&lt;/li&gt;&#xA;&lt;li&gt;Screengrab&lt;/li&gt;&#xA;&lt;li&gt;User Agent Switcher&lt;/li&gt;&#xA;&lt;/ul&gt;</description>
    </item>
    <item>
      <title>Metalink Metalink Metalink</title>
      <link>https://rmoff.net/2009/03/24/metalink-metalink-metalink/</link>
      <pubDate>Tue, 24 Mar 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/03/24/metalink-metalink-metalink/</guid>
      <description>&lt;p&gt;I&amp;rsquo;m learning about a lot of the Oracle BI stack through reading manuals and trial-and-error.&lt;br&gt;&#xA;One thing I&amp;rsquo;ve realised is that Metalink holds a whole heap of useful information.&lt;br&gt;&#xA;For example, simply searching on &amp;ldquo;publisher cluster&amp;rdquo; throws up these two very pertinent docs:&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;OBIEE Clustered Installation with BI Publisher (Doc ID 744515.1)&lt;/li&gt;&#xA;&lt;li&gt;BI Publisher does not accept cluster jdbc connection strings (Doc ID 559795.1)&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;p&gt;The first one is a &lt;a href=&#34;http://www.oracle.com/technology/products/xml-publisher/docs/BIP_HA.pdf&#34;&gt;publicly available PDF&lt;/a&gt;; the second one is the answer to the problem I spent more time than I needed to &lt;a href=&#34;https://rmoff.net/2009/03/23/obiee-publisher-configuring-connection-to-clustered-bi-server/&#34;&gt;scratching my head over yesterday&lt;/a&gt;.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Which jdbc driver to use</title>
      <link>https://rmoff.net/2009/03/24/which-jdbc-driver-to-use/</link>
      <pubDate>Tue, 24 Mar 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/03/24/which-jdbc-driver-to-use/</guid>
      <description>&lt;p&gt;In setting up the scheduler in Publisher I discovered a useful difference between jdbc drivers.&lt;br&gt;&#xA;Our repository is on Oracle 11g.&lt;br&gt;&#xA;According to the manual, oracle.jdbc.driver.OracleDriver should be used, but previous installations have used oracle.bi.jdbc.AnaJdbcDriver so I tried this too.&lt;/p&gt;&#xA;&lt;p&gt;In experimenting with both I found you get more useful feedback from the second one. Here&amp;rsquo;s the same problem reported by both drivers:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;Exception [TOPLINK-4002] (Oracle TopLink - 11g Release 1 (11.1.1.0.0) (Build 080319)): oracle.toplink.exceptions.DatabaseException Internal Exception: java.sql.SQLException: ORA-28000: the account is locked Error Code: 28000&lt;/p&gt;</description>
    </item>
    <item>
      <title>Finding config files in unix</title>
      <link>https://rmoff.net/2009/03/23/finding-config-files-in-unix/</link>
      <pubDate>Mon, 23 Mar 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/03/23/finding-config-files-in-unix/</guid>
      <description>&lt;p&gt;Following my previous work on configuring Publisher, I wanted to note down where the changes were written to.&lt;/p&gt;&#xA;&lt;p&gt;The -mtime option of the Unix find command comes in handy here:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;find /app/oracle/product/obiee -mtime -1&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;p&gt;This shows me all files under the specified path which were modified in the last day&lt;/p&gt;&#xA;&lt;p&gt;and helpfully throws up:&lt;/p&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;/app/oracle/product/obiee/xmlp/XMLP/Admin/DataSource/datasources.xml&lt;/p&gt;&#xA;&lt;/blockquote&gt;</description>
    </item>
    <item>
      <title>OBIEE Publisher - configuring connection to clustered BI Server</title>
      <link>https://rmoff.net/2009/03/23/obiee-publisher-configuring-connection-to-clustered-bi-server/</link>
      <pubDate>Mon, 23 Mar 2009 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/2009/03/23/obiee-publisher-configuring-connection-to-clustered-bi-server/</guid>
      <description>&lt;p&gt;I&amp;rsquo;m setting up a clustered OBIEE 10.1.3.4 production environment. There are four servers; two BI Server + Cluster Controller + Scheduler and two OAS + Presentation Services + Publisher. Clustering of BI is configured, now I&amp;rsquo;m setting up the other bits. Today is Publisher.&lt;/p&gt;&#xA;&lt;p&gt;On publisher instance A connections to the BI Servers directly work fine:&lt;br&gt;&#xA;jdbc:oraclebi://serverA.fqdn.company.net:9703/ jdbc:oraclebi://serverB.fqdn.company.net:9703/&lt;br&gt;&#xA;both work individually as Connection Strings (with database driver class of oracle.bi.jdbc.AnaJdbcDriver) - verified with &amp;ldquo;Test Connection&amp;rdquo; button.&lt;br&gt;&#xA;Connections also work when specifying the hostname only (i.e. no FQDN).&lt;/p&gt;</description>
    </item>
    <item>
      <title>About Me</title>
      <link>https://rmoff.net/about-me/</link>
      <pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/about-me/</guid>
      <description>&lt;h3 id=&#34;whoami&#34;&gt;&lt;code&gt;whoami&lt;/code&gt;&lt;/h3&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;Robin is a Sr. Principal Advisor, Streaming Data Technologies at &lt;a href=&#34;https://confluent.io&#34;&gt;Confluent&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;He has been speaking at conferences since 2009 including QCon, Devoxx, Strata, Kafka Summit, and Øredev.&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;blockquote&gt;&#xA;&lt;p&gt;You can find his &lt;a href=&#34;https://talks.rmoff.net&#34;&gt;talks online&lt;/a&gt;, subscribe to his &lt;a href=&#34;http://youtube.com/rmoff&#34;&gt;YouTube channel&lt;/a&gt;, and read &lt;a href=&#34;http://rmoff.net/&#34;&gt;his blog&lt;/a&gt;. Outside of work, Robin enjoys running, drinking good beer, and eating fried breakfasts—although generally not at the same time.&lt;/p&gt;&#xA;&lt;/blockquote&gt;&#xA;&lt;h2 id=&#34;-bio--speaker-photos&#34;&gt;👉🏻 &lt;a href=&#34;https://noti.st/rmoff/bio&#34;&gt;Bio &amp;amp; speaker photos&lt;/a&gt;&lt;/h2&gt;&#xA;&lt;h3 id=&#34;speaking-experience&#34;&gt;Speaking experience&lt;/h3&gt;&#xA;&lt;p&gt;Speaker since 2009 at conferences including QCon, Devoxx, O&amp;rsquo;Reilly Strata, NDC, USENIX LISA, Kafka Summit, Øredev, O&amp;rsquo;Reilly SACon, Oracle OpenWorld, JavaZone, Big Data LDN, UKOUG, Oracle CODE, PGConf, etc plus numerous meetups.&lt;/p&gt;</description>
    </item>
    <item>
      <title>Search</title>
      <link>https://rmoff.net/search/</link>
      <pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/search/</guid>
      <description></description>
    </item>
    <item>
      <title>Talks</title>
      <link>https://rmoff.net/presentations/</link>
      <pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/presentations/</guid>
      <description>&lt;p&gt;See &lt;a href=&#34;https://rmoff.net/talks&#34;&gt;talks&lt;/a&gt;&lt;/p&gt;</description>
    </item>
    <item>
      <title>Talks</title>
      <link>https://rmoff.net/talks/</link>
      <pubDate>Mon, 01 Jan 0001 00:00:00 +0000</pubDate>
      <guid>https://rmoff.net/talks/</guid>
      <description>&lt;p&gt;You can find all my recent talks on Notist at &lt;a href=&#34;https://talks.rmoff.net/&#34;&gt;https://talks.rmoff.net/&lt;/a&gt;. Older ones are on &lt;a href=&#34;https://talks.rmoff.net/&#34;&gt;Speaker Deck&lt;/a&gt;.&lt;/p&gt;&#xA;&lt;hr&gt;&#xA;&lt;p&gt;I have a variety of talks, around several different aspects of Apache Kafka and related technologies.&lt;/p&gt;&#xA;&lt;ul&gt;&#xA;&lt;li&gt;Kafka 101 / introductory talk : &amp;ldquo;Kafka as a Platform: the Ecosystem from the Ground Up&amp;rdquo;&lt;/li&gt;&#xA;&lt;li&gt;Introduction to ksqlDB and stream processing principles and semantics - with lots of live demos. &amp;ldquo;An introduction to ksqlDB&amp;rdquo;&lt;/li&gt;&#xA;&lt;li&gt;Live demo showing use of Kafka Connect and ksqlDB: &amp;ldquo;Apache Kafka in Action : Let’s Build a Streaming Data Pipeline!&amp;rdquo;&lt;/li&gt;&#xA;&lt;li&gt;Architectural view of Kafka and its implications for data engineering &amp;amp; data warehousing: &amp;ldquo;The Changing Face of ETL: Event-Driven Architectures for Data Engineers&amp;rdquo;&lt;/li&gt;&#xA;&lt;li&gt;Deep-dive on Kafka Connect: &amp;ldquo;From Zero to Hero with Kafka Connect&amp;rdquo;&lt;/li&gt;&#xA;&lt;li&gt;Nitty-gritty examining options of how you get data into Kafka from RDBMS &amp;ldquo;No More Silos: Integrating Databases and Apache Kafka&amp;rdquo;&lt;/li&gt;&#xA;&lt;li&gt;PoC showing how to build streaming ETL/data platform around live data feeds: &amp;ldquo;🚂On Track with Apache Kafka: Building a Streaming Platform solution with Rail Data&amp;rdquo;&lt;/li&gt;&#xA;&lt;li&gt;Example of integrating IoT data over MQTT into Kafka and processing it: &amp;ldquo;Building IoT applications with Kafka and ksqlDB&amp;rdquo;&lt;/li&gt;&#xA;&lt;/ul&gt;&#xA;&lt;p&gt;Detailed abstracts are available on request for these. Recordings and slides for many can be seen at &lt;a href=&#34;https://talks.rmoff.net/&#34;&gt;https://talks.rmoff.net/&lt;/a&gt;&lt;/p&gt;</description>
    </item>
  </channel>
</rss>
