The More Immediate Concerns About AI, and Why Steve Pinker is Wrong



Steve Pinker opened his two-and-a-half-minute video on Big Think (‘Why Alpha Males Fear the Rise of Artificial Intelligence‘) with a couple of very puzzling statements about gender and Greek mythology, which I can only guess was classic pseudo-intellectual bullshitting at work. Beyond that, the video doesn’t actually contain any reasoned argument against the reservations scientists and engineers have about the development of a hypothetical ‘super AI’. In Pinker’s view, a ‘super AI’ wouldn’t have attributes that render it dangerous, and it would have some undefined ‘safeguards’ – circular reasoning, as those safeguards wouldn’t exist unless the potential dangers had been taken seriously to begin with.

Unlike Sargon of Akkad, I don’t believe an artificial ‘super intelligence’, or even general-purpose machine intelligence, is even on the horizon – I’ll explain why. However, the intelligent systems technologies we already have come loaded with ethical issues. Today’s intelligent systems are mainly used for ‘big data’ and for profiling human behaviour without most people being aware of it, which already raises questions about control, oversight and surveillance applications. The facial recognition system on Facebook – where it’s impossible to prevent friends tagging you in photos, and perhaps impossible to prevent public surveillance camera footage being tied to your online identity – is just one relatively minor example. Whenever we draw cash from an ATM, at least one intelligent system is clocking the amounts withdrawn, our location, the times of day we typically use an ATM, and so on. And there was the story about a system that inferred a Target customer was pregnant before her relatives knew.

Applied AI
There’s a lot more at play than Moore’s Law when predicting how machine intelligence might develop or evolve. The development of machine intelligence has been driven largely by the need to process the increasing volumes of data being generated, mined and stored, and most of the powerful systems are used for creating abstractions of that data. In other words, there is an industry demand in the area of ‘big data’, so we can guess the direction in which the technologies might develop.

Another pattern is that machine intelligence is application-specific, either by design or adaptation. Almost all examples are simple reasoning algorithms optimised for making rational inferences from specific data sources. I don’t know of a system that does both natural language processing and chess playing, and there isn’t a system that could switch between driving a car and flying a passenger jet. And it seems we don’t currently have a way to create a ‘meta-intelligence’ capable of transcending a domain – we therefore cannot predict how our hypothetical ‘super intelligent’ system might act, since it would have to be something very different from today’s systems.

Create Your Own I2P Hidden Service



If you’ve installed I2P and run the application, there will be an ‘eepsite’ directory at C:\Users\[user name]\AppData\Roaming\I2P\eepsite\docroot. The first thing to do is modify the files there to create a basic Web site/page. Mine currently has basic HTML pages and some CSS, but no JavaScript, Flash or anything else that should be disabled for anonymous browsing.

A look at the router address book reveals how I2P addresses are resolved. Each peer running a hidden service has a cryptographic identifier/address, which is mapped to a hostname on the .i2p domain, much as IP addresses are mapped to conventional domain names.
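This hostname-to-destination mapping can be sketched in a few lines. The sample entries and truncated destination strings below are purely illustrative, but the `hostname.i2p=Base64destination` line format mirrors the hosts.txt files I2P uses for its address books:

```python
# Minimal sketch of how an I2P address book maps .i2p hostnames to
# Base64 destinations. One "hostname.i2p=Base64destination" per line,
# in the style of I2P's hosts.txt files.

def parse_address_book(text):
    """Parse hosts.txt-style lines into a {hostname: destination} dict."""
    book = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        host, _, dest = line.partition("=")
        if host and dest:
            book[host] = dest
    return book

# Illustrative entries only; destinations truncated for readability.
sample = """
# local address book
stats.i2p=Okd5sN9hFWx-sr0HH8EFaxkeIMi6PC5eGTcjM1KB7uQ...
forum.i2p=QW8yQ2JwuYnCx1KrsIR9CQFV3bNAOpAHvylGH5rUY9E...
"""

book = parse_address_book(sample)
print(sorted(book))  # the hostnames now resolvable locally
```

Resolving a hostname is then just a dictionary lookup, which is why registering a new eepsite amounts to getting your hostname/destination pair into other peers’ address books.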


Registering an Address
In the Hidden Services Manager, there’ll be a section for the local Web server, whether or not it’s active.


Click on ‘I2P webserver‘, and the ‘Edit Server Settings‘ menu will appear. Here there are three important fields: Name, Website name and Local destination.


The Website name is the hostname we want to register for the service, in the form [site name].i2p. The Local destination is a very long Base64 cryptographic identifier – copy and paste this into a text editor.

After naming the site and defining the .i2p address, go to the Master Address Book (‘Addressbook‘ under the I2P Internals panel). Near the foot of this page there are the ‘Add new destination‘ fields. Paste the Base64 Local destination string as the Destination, and enter the desired .i2p hostname as the Host Name.

The site/domain details are now added to the Master Address Book, i.e. the domain name system for I2P, and should eventually be visible to all peers.


If the site works after entering the .i2p address in the browser, you have a working hidden service.

Finally, you might want to publish the domain with no.i2p. I’ve had problems getting this portal to accept my Base64 hash value, but hopefully my entry in the address books will take care of it.


Anonymising Overlay Network



Given how easy it is to get started with I2P, I’m surprised it hasn’t received anywhere near the same attention as Tor. Like Tor, I2P is an overlay network that creates a multi-layered outbound tunnel through a series of ‘routers’, with each router along a path able to decrypt only one layer. A separate tunnel is created for inbound traffic. In theory, the payload is accessible only to the endpoints. The routing and addressing of I2P peers is also decentralised.
For users, getting started is a simple matter of downloading and running the client/server software and waiting for it to build a list of reachable peers.
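The layered-tunnel idea can be illustrated with a toy example. This is emphatically not I2P’s real cryptography (which uses proper ciphers and per-tunnel session keys); XOR simply stands in for encryption to show the structure: the sender wraps the payload once per hop, and each router strips exactly one layer.

```python
# Toy illustration (NOT real I2P crypto) of layered tunnel encryption:
# the sender applies one layer per router, each router peels only its
# own layer, and only the endpoint recovers the original payload.

def xor_layer(data: bytes, key: bytes) -> bytes:
    """XOR data with a repeating key; applying it twice undoes it."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

payload = b"hello, endpoint"
hop_keys = [b"router-one", b"router-two", b"router-three"]

# Sender: wrap the payload in layers, innermost layer last on the path.
packet = payload
for key in reversed(hop_keys):
    packet = xor_layer(packet, key)

# No single intermediate state equals the payload...
assert packet != payload

# ...until each router along the path has removed its one layer.
for key in hop_keys:
    packet = xor_layer(packet, key)

print(packet)  # the endpoint sees the original payload
```

The point of the structure is that a router in the middle sees only ciphertext in and ciphertext out, plus the next hop, never the payload or the full route.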

I used the word ‘pseudonymity’ intentionally, and it has a specific meaning – most of us want an online identity, but it’s not always desirable to associate it with our offline lives. As it stands, with most people using social media on the clear Web, it’s fairly trivial to identify who posted what, since profiles are mostly built around real names and real-world identities. If you use Facebook, Google or social media generally, there’s a good chance the service providers have built a fairly detailed profile of you – town of residence, the sites you browse, the people you hang out with, etc. – regardless of how much of that information is public. Censorship and surveillance have never been easier, when almost everything is in the hands of Google, Facebook, Twitter, etc. So there are two issues that result in a gross information asymmetry: the triviality and low cost of mass surveillance, and the centralisation of our means to communicate. I2P is a solution to both.

Pseudonymity is about creating a profile and identity while keeping it separate from our real-world identities. That separation could safeguard freedom of expression by enabling the sharing or challenging of controversial ideas without fear of recrimination. Thankfully this is already possible with I2P, Tor, (carefully selected) VPNs and a little understanding of how to sanitise browser traffic. With Tor and I2P, it is possible to have a domain, site and email address that are separate from the clear Web.

Starting I2P
Although the I2P software is more or less usable straight out of the box (at least for Windows), it works much better after waiting about 30 minutes for the client/server to build its list of peers.

When the I2P application is installed and run, it starts the default Web browser with the GUI loaded. This is where users can restart the proxy’s tunnels, view the network status and access some of the hidden services. Clicking the logo at the top-left will toggle the interface between the hidden services menu and the management console.


Next, the browser must be configured to route traffic through the local I2P proxy. Go to Internet Options, or wherever the connection settings are for your browser, and point the proxy settings to 127.0.0.1 on port 4444. If you’ve got multiple proxies installed on a machine for different things, the FoxyProxy extension for Firefox provides an easy way of switching between them.
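The same proxy settings work outside a browser too. As a rough sketch, a script can be pointed at the local I2P HTTP proxy (127.0.0.1:4444 is the default) so that .i2p hostnames resolve inside the network; the stats.i2p URL in the comment is just an example destination:

```python
# Sketch: routing HTTP requests through the local I2P HTTP proxy
# (default 127.0.0.1:4444) from a script instead of a browser.
import urllib.request

proxy = urllib.request.ProxyHandler({
    "http": "http://127.0.0.1:4444",  # I2P's default HTTP proxy
})
opener = urllib.request.build_opener(proxy)

# With the I2P router running, .i2p hostnames become reachable, e.g.:
# response = opener.open("http://stats.i2p/")
```

Anything not sent through this opener bypasses the proxy entirely, which is exactly the kind of leak FoxyProxy-style per-site switching can cause if misconfigured.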

One of the weaknesses of Tor and I2P is that they mask only the clients’ IP addresses, and not much else – this is why they won’t guarantee anonymity on their own. The payload might still include identifying information, so ideally you’d have something to strip that out before it leaves the local network. It should be possible to chain I2P and Privoxy, in the same way we might for Tor, in order to strip potentially identifying information from browser traffic. It’s also possible to use Lynx, with the proxy addresses for HTTP and HTTPS configured in lynx.cfg.
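To make the idea concrete, here is a minimal sketch of the kind of header scrubbing a filtering proxy like Privoxy performs. The header names and values are illustrative, not an exhaustive fingerprinting list:

```python
# Sketch of scrubbing identifying request headers before traffic
# leaves the machine -- the sort of filtering Privoxy does.

IDENTIFYING = {"user-agent", "referer", "cookie", "accept-language"}

def scrub_headers(headers: dict) -> dict:
    """Drop headers that commonly fingerprint a user; keep the rest."""
    return {k: v for k, v in headers.items()
            if k.lower() not in IDENTIFYING}

# Illustrative request headers only.
original = {
    "Host": "example.i2p",
    "User-Agent": "Mozilla/5.0 (Windows NT 6.1; rv:45.0) ...",
    "Accept-Language": "en-GB,en;q=0.5",
    "Cookie": "session=abc123",
}
print(scrub_headers(original))  # only the Host header survives
```

Headers are only part of the problem, of course – form data, URLs and browser quirks can identify you just as readily, which is why a sanitising proxy is a complement to Tor/I2P rather than a substitute.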

As with the .onion services, not all the I2P services are available at a given moment. Some are offline and others take a while to reach.

Firewall Rules
Apparently firewall configuration isn’t necessary, but I added inbound and outbound rules anyway, just to see whether it made a difference. This can be done by opening the Windows Firewall GUI, and selecting ‘New Rule…’. Here we want to create a rule for a program (‘Rule that controls connections for a program’).


Find the I2Psvc.exe file in Windows Explorer (C:\Program Files\i2p\) and paste its location into the New Inbound Rule window.


Proceed with the default options and do the same for the outbound rule.

Of course, an anonymous email account is a necessity, and we have two options: either register a clearweb account over a VPN connection or use an I2P email service. Here I’ve registered a test account with mail.i2p, which is administered by Postman HQ (hq.postman.i2p).
The downside is that mail can only be routed within the I2P network – not from a clearweb mail account to the mail.i2p server, or vice versa.

Personal Site
A hidden service on the I2P domain is referred to as an ‘eepsite’. Each user can create their own site/service, using the local proxy as a Web server. As I’ve mentioned, this could be how social media profiles and groups are hosted a few decades from now.
The server directory is found at C:\Users\[user name]\AppData\Roaming\I2P\eepsite.


To put this online, click the Hidden Services Manager link under the I2P Internals section, and start the I2P Web server.


After the Web server is fully operational, which takes a few minutes, the local I2P address is found in the Hidden Services Manager, under Edit Server Settings. Changes to this address should eventually propagate through the I2P addressing system.


Scary Detector Vans



The other day, The Telegraph made the bold claim that ‘the BBC is to spy on internet users in their homes by deploying a new generation of Wi-Fi detection vans to identify those illicitly watching its programmes online‘. Try as I might, I couldn’t determine how that would work, and the only possible source for this disclosure I could find was a report published by the National Audit Office (dated March 2016).

As it turns out, The Telegraph must have jumped to this conclusion after the BBC closed a legal loophole that allowed people to watch iPlayer without a TV licence, and after seeing the following statement in the Audit Office report: ‘TVL detection vans can identify viewing on a non-TV device in the same way that they can detect viewing in a television set‘.
The only way that could be true is if the detection equipment were comparing light radiation from a TV/monitor with a live broadcast. There’s nothing in the report about WiFi signals. In fact, the report itself implies that whatever ‘evidence’ is being gathered inside the ‘TVL detection vans’ is actually less reliable than the contemporaneous notes made during a physical inspection of suspected licence evaders’ properties.

Let’s suppose, hypothetically, the BBC did commission a TV Licensing authority to go around monitoring people’s WiFi traffic. From a technical perspective, the people manning the BBC detection vans would be no different from a criminal parked outside with a laptop running a packet-sniffing tool. This is true regardless of what the law says, simply because the technology cannot distinguish between adversaries.

Thankfully, the basic security on a typical WiFi router provides a good level of protection against such an adversary. Anyone could put their network interface into ‘monitor mode’ and passively capture encrypted packets/frames from multiple nearby networks, but that wouldn’t reveal much about what’s being communicated unless the traffic were later decrypted somehow.
Technically it’s possible for the iPlayer server to send beacon or signature packets – perhaps of a specific size at a specific frequency – then listen for that signature in WiFi traffic (edited to add: Dr. Miguel Rio suggested something similar in the Telegraph article). You could just about detect that by running a capture in monitor mode, assuming the owner hasn’t changed the MTU while trying to fix router/ISP syncing problems. The problem is that this alone is nowhere near enough evidence to prosecute someone.
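A rough sketch of that hypothetical signature check: frame sizes and timings stay visible even when WPA encrypts the payloads, so an observer could count frames clustering around a known beacon size. Every number here is made up for illustration, and the tolerance parameter is exactly the fudge factor that an owner-changed MTU would defeat:

```python
# Hypothetical sketch of the 'traffic signature' idea: a server emits
# packets of a distinctive size, and an observer in monitor mode looks
# for that size pattern in encrypted frame metadata. Frame sizes are
# visible even when the payloads themselves are encrypted.

def matches_beacon(frame_sizes, beacon_size, min_repeats=5, tolerance=8):
    """Return True if enough captured frames cluster near beacon_size."""
    hits = sum(1 for size in frame_sizes
               if abs(size - beacon_size) <= tolerance)
    return hits >= min_repeats

# Captured frame sizes from one station (illustrative numbers only).
capture = [1500, 1342, 1337, 612, 1340, 1338, 98, 1341, 1336, 1500]
print(matches_beacon(capture, beacon_size=1339))
```

Even when the pattern matches, it’s circumstantial at best: any service sending similarly sized packets at a similar rate would trigger the same match, which is why this couldn’t stand alone as evidence.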

So, in conclusion I think it’s another scare story.

Untangling Some MVC 5 Problems



The following are fixes to some problems I encountered while building a Messaging Dashboard application using MVC 5 and Entity Framework 6.

Message Queue Doesn’t Appear in Entity Framework Model
Unlike a conventional database table, a message queue in the Service Broker part of the database cannot be imported into the Entity Framework model, and Entity Framework can’t read the queue through a stored procedure either (at least not without going through a Web Service).
As a workaround, in SQL Server Management Studio, create a view over the relevant Service Broker message queue. Next, import that view into the application’s Entity Framework model.

Error 404: Cannot Find Resource
In my case the problem was related to routing and the startup order of the application components. To resolve it, open the project’s properties and go to the Web section. Clear the Specific Page text box and check the ‘Current Page‘ radio button.


After doing this, the application should run in debug mode, either by right-clicking the relevant .cshtml file and selecting ‘Set As Start Page‘ or ‘View in Browser‘.

If the same problem persists, run IIS Manager and ensure the application has an entry under Default Web Site while it’s executing. In the Browse for Directory dialog, find and select the location where the Visual Studio project is stored:


Back in the Visual Studio project’s properties, within the Servers section, click the Create Virtual Directory button to load the virtual directory into IIS.


CS1061: Does Not Contain Definition and no Extension Method


As far as I can determine, the problem is a mismatch somewhere between the Model, Controller and View layers when referencing the object, and I think this happened because Visual Studio didn’t build the Controller-View ‘scaffold’ properly.
Scrap the Controller and Views for the database table/queue, leaving the Model in place. Next, create a new Controller, this time by selecting ‘Add‘ – ‘Controller…‘ instead of ‘New Scaffolded Item…‘, and, as before, select ‘MVC 5 Controller with views, using Entity Framework‘.


This time, double-check that the ‘Reference script libraries‘ option is not selected, and leave the ‘Use a layout page‘ field blank.

