The Prometheus League
Breaking News and Updates
Category Archives: Mind Uploading
How Vestas Wind Systems used outsourced machine learning to transform contract management – Diginomica
Posted: June 7, 2017 at 5:18 pm
Vestas wind turbines in Australia
Our diginomica inboxes are awash with machine learning PR pitches. But when the chance came to talk to Vestas Wind Systems A/S about its lessons with machine learning in action, delivered via an outsourcing partner, that one got flagged, in a good way.
Henrik Stefansen, Senior Director, Global IT Sourcing at Vestas Wind Systems A/S, gave me the inside view. Founded in 1945, this Danish manufacturer and servicer of wind turbines has become a global player in wind energy. Now with turbines in more than 70 countries, Vestas bills itself as the only global energy company dedicated exclusively to wind energy.
Five years ago, Vestas Wind Systems was dealing with the complications of declining government subsidies. The global economy was working its way out of a recession. Higher operational costs combined with sluggish energy demand compelled Vestas to push hard for new efficiencies. Stefansen has been an IT leader at Vestas for sixteen years. In the last four years, he's led a drastic change:
"We've gone from being a fully insourced company on the IT side about four years ago, to today being more or less fully outsourced. So that's been quite a journey."
Managing outsourced processes has brought a learning curve:
"I've come to realize that a lot of the other stuff we need to be able to handle, and the processes you need to have in place to manage an outsourced setup, are quite different from when you run everything yourself."
That opens up a chance to improve processes:
"That's really where we got into looking at, 'How can we optimize and automate some of these processes instead of doing everything manually?'"
Stefansen handles these operations with an internal team of twenty, and about a dozen externals. 27,000 employees count on his team's IT services. If you can't handle the breeze, don't be in the renewable energy business:
"We went through a bit of a dip through the financial crisis around 2011, where we cut the company in half. We had to reduce that much. But we recovered from that, and had a record year last year."
How has the wind energy business changed since Stefansen's early days at Vestas?
"When I joined the company, we were still sort of an entrepreneurial startup. Over the last five, seven years it's been much more industrialized. Now wind is a competitor, and a substitute for all of the known coal and gas sources as well."
"Today, I would say, wind is more or less on par with coal and gas, also from a cost perspective. And that's of course what we've been working towards these last many years. If you want to sustain a business like this, it has to be comparable on a cost level to the other energy sources out there. That's roughly where we are now."
Success brings its complications:
"Looking at it from a country or global perspective, there's no doubt that renewable energy is high on the agenda in most countries these days. That makes it a nice place to be in a company like this. But it's also a highly regulated environment. There are a lot of restrictions from local governments that we need to work with to promote this kind of energy."
Stefansen's approach to outsourcing has changed also. At first, outsourcing was a tactical decision in response to the economic downturn: "We had to reduce head count, we had to reduce cost, and we had to do it fast." As Vestas bounced back, Stefansen decided that outsourcing was their future course, but now they approach it more strategically.
Outsourcing makes sense for Vestas on several fronts. It solves the challenge of needing to staff up internal IT in Denmark. Stefansen also likes the flexibility on cost and exposure to new technologies:
"We also saw the possibilities of joining forces with some of the big outsourcing vendors out there that have thousands of people. They can bring us those new technologies much faster and better than we could develop them ourselves."
And that's where SirionLabs comes in. Stefansen found the downside of outsourcing was managing the services. Ideally, he could automate a big chunk of contract management and have it delivered as a service. During his research, Stefansen found SirionLabs. He evaluated a range of providers:
"I looked at a few of the big ones, including IBM and SAP. They had good capabilities in some of the areas that I needed, but none of them really had the view of, and connectivity between, the different parts of the process that I saw with Sirion."
Stefansen also liked SirionLabs' cloud emphasis:
"Their software as a service comes pre-configured out of the box, so you don't have to do the on-site installations and setup. Basically, I just ship my contracts to Sirion, they upload them in India, and we are live."
Vestas started working with SirionLabs in 2015. They spent the first few months uploading contracts, but that wasn't the biggest change:
"Once you start working with a tool like this, there is a set of processes that enables you to get the benefit out of the tool. That was the main part of the implementation: getting those processes established within our own organization."
The big surprise wasn't process change; it was the people side.
"That's probably one element that surprised me a little bit: how much energy I had to put into my own organization to get my own colleagues to work in these new processes."
What changes did Stefansen see after going live with SirionLabs? One big change: tracking of deliveries and obligations. Sirion pulls all of the outsourcing vendors' obligations from their contracts, and puts them into a calendar view for tracking:
"All of that is alert-based. Alerts tell us that, 'This is supposed to be delivered now. Did you receive it, or is it still pending?' In the past, we would have missed that, because it would have taken a lot of manual effort to track all of this."
On the IT side, SirionLabs is now handling Vestas' four main outsourcing partners, comprising 70-80 percent of all outsourced services. It's really a shift to a proactive way to manage outsourcers, and Stefansen has already seen cost reductions:
"[Another part] of our cost savings is the invoice reconciliation. Basically, matching invoices to what we've agreed in the contract, and making sure that we are paying them correctly. That's where we see a lot of the direct cost savings."
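The reconciliation step Stefansen describes can be sketched in a few lines. Everything here, the rate table, the field names, the tolerance, is invented for illustration; it is not SirionLabs' actual implementation, just the shape of the idea: compare each invoice line against the contracted rate and flag deviations for review.

```python
# Hypothetical contract rates per service, in dollars per hour.
contract_rates = {"service-desk": 42.00, "app-support": 55.00}

# Hypothetical invoice lines received from an outsourcing vendor.
invoices = [
    {"service": "service-desk", "hours": 100, "billed": 4200.00},
    {"service": "app-support", "hours": 80, "billed": 4650.00},  # overbilled
]

def reconcile(invoices, rates, tolerance=0.01):
    """Return invoice lines whose billed amount deviates from the contract."""
    flagged = []
    for line in invoices:
        expected = rates[line["service"]] * line["hours"]
        if abs(line["billed"] - expected) > tolerance:
            flagged.append({**line, "expected": expected})
    return flagged

discrepancies = reconcile(invoices, contract_rates)
```

In this toy run, the app-support line is flagged because 80 hours at the agreed $55 rate should come to $4,400, not $4,650.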
The savings aren't small: Stefansen's first-year calculations on the SirionLabs investment show a 300 percent ROI. We talked about the machine learning aspect. Stefansen doesn't need to know the inner workings of Sirion's machine learning capabilities to see the value on his side.
SirionLabs applies machine learning to areas Vestas would have struggled to monitor on their own, from incorrect invoicing to avoiding SLA penalties that are invoked when a usage threshold is reached. As the SirionLabs PR team put it to me, the platform uses machine learning to cull through the mind-numbing tedium of contracts to ensure everyone is doing their job.
Looking ahead, Stefansen wants to get his outsourcing partners to use SirionLabs to collaborate and address contractual issues. "So far, we've seen good benefits from that, where we've managed to convince our outsourcing partners that this is a good idea."
Today, SirionLabs manages contracts valued at $160 million for Vestas. For Stefansen, better control over back office IT means his team can be more strategic, and less caught up in administrivia:
"If I hadn't implemented this, I would probably have had to hire, say, four people to manage these things manually. So it gives me a lot of flexibility from an organizational point of view."
Image credit - Image of Vestas wind turbine in Macarthur, Australia from the Vestas.com web site, model number V112-3.0 MW.
Virender Sehwag posts Sourav Ganguly and Shane Warne's embarrassing sleeping pictures on social media – Sportswallah
Posted: June 5, 2017 at 7:29 am
Sehwag, being the notorious mind that he is, snapped them on his mobile phone and wasted no time in uploading these pictures on his social media. The funniest was the caption he wrote: "The future is shaped by one's dreams. These legends still don't ..."
Shhhh! Use These 5 WordPress SEO Secrets to Drive Insane Traffic – Small Business Trends
Posted: June 3, 2017 at 12:30 pm
There are more websites than there are people in the United States, by a good margin. The population of the U.S. is around 321 million, while the latest web server survey, in May 2017, stated that there are about 1.8 billion websites online. That's a lot of websites to compete against: more than there are consumers in the U.S.
The increasing number of websites online has made the competition to be found online even more challenging. In other words, getting visibility online keeps getting tougher as search engines become saturated with websites. Creating a website and hoping for the best is not enough. Smart business owners must stay on top of the latest SEO and paid advertising trends to beat the competition.
Furthermore, keep in mind that your efforts have to be constant. Doing one SEO tweak once in a blue moon won't yield results. Fortunately, content management systems like WordPress make SEO easier to manage with user-friendly platforms and SEO plugins. Here you'll learn the SEO fundamentals to use for WordPress to jumpstart your efforts. Let's get started!
SEO stands for Search Engine Optimization. It refers to the process and methods of gaining visibility online through free, or organic, search results in search engines such as Google, Bing, or Yahoo. As you can see below in my search for jumpsuits on Google, I found paid listings (specifically, Google Shopping or PLA ads) and organic listings, enclosed in green.
If I were searching for a service, I would have found a similar mix of paid and organic listings, but with a different look. In this case, there are no Google Shopping ads, but Google paid search ads.
The middle listings are locations found in Google Maps; they are also not paid. Getting listed on Google Maps or other local directories would be considered local SEO: still important, but not the primary focus of this article. We'll mainly focus on SEO strategies for eCommerce retailers.
There are on-site and off-site factors that affect SEO. Examples of on-site factors are a website's content, structure, and speed. Some off-site factors that affect SEO are outside links pointing to the site and its social media following and engagement.
Search engines like Google want to provide the best user experience for the searcher; therefore, they use these and other factors to rank websites. For example, websites that have an organized structure will have higher rankings than websites that don't. This is because more organized structures help users find what they are looking for faster, which leads to a better user experience, a priority for search engines.
Another thing to keep in mind is that although organic rankings don't require payment to Google, they will still end up costing you money. Whether you decide to hire a search marketing agency or do it yourself, advanced SEO efforts such as link building or writing will require additional paid help. Luckily, all the WordPress SEO tips we'll cover in the next section can be done fairly easily.
A permalink is a URL to a specific post. Instead of having a URL with numbers or dates at the end, such as http://www.yoursite.com/1234, the recommended permalink structure is to use more user-friendly URLs, like http://www.yoursite.com/seo-guide. These types of URLs are easier to share and are preferred by search engines.
Using dates, for example, can make posts look outdated (if the date is old), which, in turn, can lead to lower click-through rates. Which URL would you click on: a post with the URL http://www.yoursite.com/12-5-12 or http://www.yoursite.com/seo-guide? You'd probably skip the post from 2012 and opt for the one that says SEO guide.
To get the ideal permalink structure in WordPress, simply go to Settings > Permalinks and select Post Name, as you can see in the image below.
If you haven't used this structure so far, make sure to redirect old URLs to the new ones to prevent 404 errors. There are online tools and plugins that can make this process easier.
Additionally, you can add the category name before the post's name. This may be a good idea if your categories and post names are short and descriptive. Otherwise, if your URL is too long, it may get cut off, which is not ideal.
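The user-friendly slugs recommended above can be produced mechanically from a post title. WordPress does this itself when you pick the Post Name structure, so the snippet below is purely an illustrative sketch of the transformation, not WordPress code:

```python
import re
import unicodedata

def slugify(title):
    """Turn a post title into a short, descriptive URL slug:
    lowercase words separated by hyphens, with punctuation dropped."""
    # Strip accents and any other non-ASCII characters.
    text = unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode()
    # Collapse every run of non-alphanumeric characters into a hyphen.
    text = re.sub(r"[^a-zA-Z0-9]+", "-", text).strip("-")
    return text.lower()

slug = slugify("5 WordPress SEO Secrets!")
# The slug then forms a permalink such as http://www.yoursite.com/5-wordpress-seo-secrets
```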
Title tags are one of the most important steps in optimizing website pages for WordPress or any other platform. The title tag is the first snippet of content searchers will read about your page, and it will help differentiate your listing from the rest.
Keep in mind that title tags are meant to encourage the user to click on your listing; they can't look like a bundle of nonsense keywords. They should contain certain keywords in a manner that is easy to read: your focus keyword, your brand's name, and some supporting text to provide the user more information about the page.
Also, note that title tag length varies according to screen display, so make sure your most important keywords are positioned toward the front. Title tags can help you increase your click-through rate, or CTR, and, in turn, increase your organic ranking; so, the more enticing your title can be, the better.
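The advice above, focus keyword first, brand last, nothing too long, can be expressed as a small helper. The 60-character budget here is a common rule of thumb for avoiding truncation, not a fixed limit from Google, and the function names are invented for this sketch:

```python
def build_title_tag(focus_keyword, brand, supporting="", max_len=60):
    """Assemble a title tag with the focus keyword up front and the
    brand name last, and report whether it fits a rough length budget."""
    parts = [focus_keyword]
    if supporting:
        parts.append(supporting)
    title = " | ".join(parts + [brand])
    return title, len(title) <= max_len

title, fits = build_title_tag("WordPress SEO Guide", "Small Business Trends")
```

When `fits` comes back false, trim the supporting text first, since the keyword and brand are the parts worth keeping.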
Meta descriptions are located below the listing's URL. These are the snippets of information that allow the user to get more insight into the page's content. They can help with click-through rates; however, they don't affect Google's ranking anymore. In 2009, Google announced that meta descriptions and meta keywords don't factor into Google's ranking. Although this news has been out for quite some time now, there are still many people who use meta keywords. Don't bother wasting your time.
Note that if you don't use a meta description, the search engine will generate one automatically by finding the keyword searched for in your document and pulling in the text around it. This shows a bolded word or two in the results page. See, below, an example of an automatically generated meta description in red and a manually created description in green:
As you can see, meta descriptions that are created manually look better and are more enticing than automatically created ones.
An XML sitemap lists all the pages in a website and shows the relationships of content within the site, such as organization, navigation, and labeling. It allows search engines to crawl your site and properly index pages. Having a sitemap won't automatically boost your ranking; however, it will help search engines crawl your site and find pages faster. Also, it will allow you to keep track of all your pages to make sure there are no broken links and all redirects are properly in place.
The easiest way to create a sitemap in WordPress is using a plugin such as Yoast or Google XML Sitemaps. With Yoast, you'll simply have to enable the XML sitemap functionality. Every time a new page is created, your sitemap will automatically be updated.
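To make the plugin's output less of a black box, here is a hand-rolled sketch of the minimal sitemap format those plugins generate, a `urlset` of `url`/`loc` entries under the sitemaps.org namespace. In practice you would let the plugin do this; the sketch is for illustration only:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap: one <url><loc>...</loc></url>
    entry per page, wrapped in the standard <urlset> element."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

xml_out = build_sitemap([
    "http://www.yoursite.com/seo-guide",
    "http://www.yoursite.com/about",
])
```

Real sitemaps often add optional `lastmod` or `changefreq` children per URL; only `loc` is required.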
Image optimization is often overlooked, but it's an important component of your SEO efforts for many reasons. To start, your images need to be the right size and dimensions. If an image is too big, the page will take too long to load and cause a bad user experience, which, in turn, results in lower rankings. The file size is measured in KB or MB, and you can think of it as the weight of the image. The file's dimensions are measured in width and height, in pixels.
Regarding image size, full-page images should be around 80KB to 100KB at most. If the image is part of a page, 20KB to 30KB is fine. Images in full-screen mode can be around 1280px wide, or even wider. Fortunately, when uploading images, WordPress automatically creates three resized versions in addition to the original: large, medium, and thumbnail. Thus, you can select a different size if you need it.
Other useful optimizations are the addition of image alt tags and title tags. Alt text shows when an image is unable to display, and it helps the user know what the image is about. Title tags help search engines know what your images are about so they can index them properly.
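A quick pre-upload check against the size budgets mentioned above can save a round trip through WordPress. The thresholds below simply mirror the article's rules of thumb (100KB for full-page images, 30KB for in-page images); they are not official limits:

```python
def image_size_ok(size_bytes, full_page=False):
    """Return True if an image file fits the suggested KB budget:
    roughly 100KB for full-page images, 30KB for in-page images."""
    limit_kb = 100 if full_page else 30
    return size_bytes <= limit_kb * 1024

# The same 95KB file passes as a full-page image but fails as an in-page one.
checks = [
    image_size_ok(95 * 1024, full_page=True),
    image_size_ok(95 * 1024, full_page=False),
]
```

In a real workflow you would feed this `os.path.getsize(path)` for each file before uploading.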
SEO is a must to increase a website's visibility and stay competitive. Smart business owners must follow best practices to optimize all the various off-site and on-site factors that contribute to healthy SEO.
Although SEO is meant to increase free traffic, there are advanced techniques that will require some paid help. Allocate a budget for SEO tasks to ensure you're prepared for any future workload. Putting these SEO fundamentals into practice will help you take your WordPress site to the next level.
SEO Photo via Shutterstock
Iggy Azalea slams record label in epic Snapchat rant – NEWS.com.au
Posted: June 1, 2017 at 10:39 pm
The Aussie rapper has hit out at her record company over the failure of her new music in her latest Snapchat rant. Courtesy: Iggy Azalea
Iggy Azalea's probably not very popular at her record company right now.
JUST three short years ago, Australian-born rapper Iggy Azalea was one of the hottest names in music.
She had one of the biggest hits of 2014 with the inescapable Fancy, was an in-demand guest star for the likes of Ariana Grande and Britney Spears, and had a platinum-selling debut album under her belt.
Flash forward to 2017, and Azalea's musical fortunes have waned. The three singles she's released so far this year have all failed to set the charts alight, with her most recent, Switch, debuting at 180 here in Australia. Her oft-delayed second album Digital Distortion was finally due for release at the end of this month, but Azalea now admits even she doesn't know when it'll see the light of day.
Azalea in the video for her single Mo Bounce. It did not chart. (Source: News Corp Australia)
Taking to her Snapchat account, Azalea let rip with a four-minute rant directed squarely at her record company.
"Just wanted to tell you a little story about my album pre-order, because I have no idea when it's coming out," she told her followers, using a filter to mask her face and voice.
"At first, everyone at my label was like June 2nd, June 2nd! So I told everybody that. But then they're like, No, we changed our mind," she said.
"I was like, OK, what's my album date going to be? June 30th, June 30th. So I did a bunch of interviews on TV, and it seemed like all the reporters knew the date June 30th. I was thinking, yay! They finally got their s**t together! They're telling everyone a real date that we can look forward to!"
"But then I went on Jimmy Fallon and I noticed him saying Digital Distortion, later this month. What does that mean? That can't be good. They've backed away from saying a real date."
I-G-G-Y. (Picture: Getty Images)
Azalea says she emailed her record company asking why they were no longer providing a concrete release date for the album.
The response she got was, in her words, "bulls**t": they said they hadn't done the paperwork, so a June 30 release was not going to happen.
She sent another, more aggressive, email.
"I guess everybody at that record label just decided to ignore what I had to say, so who knows when my album's coming out. It has been Memorial Day weekend, so they'll probably be partying pretty hard for middle-aged white guys..."
"I know you guys are sick of it; trust me, I'm super sick of it. I'm really over it at this point."
Azalea even threatened to provide the address of her record company to fans so they could picket the building together.
"I've really been trying to be sweet and super-professional the last couple of months, but ... I'm at my wits' end," she said.
At around the same time Azalea was uploading these videos, half a dozen tracks from Digital Distortion leaked online, leading some fans to suggest it was Azalea herself leaking the songs.
In a new Snapchat video, with her album online for all to hear and having deleted her previous rants, a giggling Azalea apologised and announced that she'd been on "pretty strong meds" after having oral surgery.
Digital Distortion is not presently available to pre-order on Australian iTunes.
Justin Bieber’s Been Searching For ‘MDMA’ On YouTube And There’s Receipts To Prove It – We The Unicorns
Posted: at 10:39 pm
01 June 2017, 11:06
In the words of Melania Trump, what is she thinking?
Justin Bieber appeared to dump a large chunk of his camera roll onto Instagram yesterday, uploading a whopping 26 snaps in just one hour. From a selfie with his finger up a friend's nose (gross, right?) to a shot of his head superimposed onto Halsey's body, the YouTube star turned pop icon has been branching out from his usual formula of staged candids, tour snaps and ab shots.
But one in particular seems especially peculiar. The upload in question shows a YouTube search bar with the words "MDMA used for" (that's ecstasy, kids) typed out, and an absolutely bizarre video playing underneath.
The video, entitled When People Get High As F*ck, starts off with a Jerry Springer clip, which then transitions to an interview with a shirtless hippy. The interviewer asks him, "What do you think the meaning of life is?" To which our shirtless friend replies, "To live in the mystery and to find purpose." MIND BLOWN.
He actually makes some pretty delightful videos, including this wonderfully pure dance number. Matthew, we like the cut of your jib.
We still have no idea why Justin shared the surreal clip. Is he trying to warn his young fans about the dangers of narcotics? Was he high af? Is this all a mysterious performance art? All we know is:
HPE and `The Machine’ potentially the next big IT blockbuster, but one helluva gamble – Diginomica
Posted: at 10:39 pm
So HPE has now got as far as announcing the first prototype of 'The Machine', first talked about towards the back end of last year. The beast is real and, if the numbers surrounding it are to be believed (and who am I to argue), it represents a significant step forward in available resources and performance.
For example, the prototype features 160TB of memory spread across 40 separate nodes connected by photonics links. And as its architecture is designed squarely around in-memory processing models, all of it is available, all of the time. According to the company, this allows the equivalent of simultaneous work on some 160 million books, or five times the number of books in the US Library of Congress.
But this is only a prototype, and these numbers are, in the great scheme of what HPE envisages for 'The Machine', really only chicken feed. If its dreams come true, we are now staring at an architecture that can easily scale to an exabyte-scale, single-memory system as it stands. Out into the future the company is already talking mind-boggling numbers: how about 4,096 yottabytes (where a yottabyte equals 10^24 bytes)? That, the company reckons, is the equivalent of 250,000 times the entire digital universe that exists today, in a box.
This is a new class of memory technology based on large, persistent memory pools that can stretch right out to the edge.
That is the basic outline of it given by Andrew Wheeler, the Deputy Director of the HPE Labs team that has developed the architecture and the prototype. The interesting factor here is that HPE has set out to develop an inclusive architecture, rather than an exclusive buy-all-or-nothing approach. So when it comes to working out at the edge, the devices used can be whatever is extant and/or appropriate for the specific task in hand at that point.
The system is based on an enhanced version of Linux, so the ability to run Linux may even be the only requirement made on such devices. So, while the prototype has been built on devices developed by Cavium and based on ARM architectures, this does not mean that everything out at the end needs to be based on that same device.
"The premise of our Intelligent Edge design is that users will want to do analytics processing as close as possible to where the data is generated. Take an application like video processing; users won't want to be pushing all that data to some central location for processing. That is just not sustainable or cost effective. The question then is just 'what is the processor relevant to getting the job done?'"
So the idea is to do as much processing as close to the point of generation as possible. Ask the local device if someone carrying a red backpack was spotted in a time frame, rather than send all the data to a central location and then process it. It is only the results that are actually important and need uploading. This does create another problem that Wheeler's team has been doing a lot of work on. Communicating between the core and the edge requires agents capable of ensuring that instructions are interpreted correctly, that relevant standards are adhered to, and that returned data is in a form that can be used immediately.
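The red-backpack example above can be sketched as a toy query that runs on the edge device itself, so only the (tiny) result set travels back to the core rather than the raw footage. The event records and field names here are entirely invented for illustration; nothing about HPE's actual agent design is implied:

```python
# Hypothetical detection log held locally on an edge device,
# e.g. objects recognised in a camera feed, keyed by timestamp.
events = [
    {"t": 100, "object": "red backpack"},
    {"t": 205, "object": "blue suitcase"},
    {"t": 310, "object": "red backpack"},
]

def edge_query(events, wanted, t_start, t_end):
    """Filter locally on the device; only matching timestamps
    (not the underlying video data) are returned to the core."""
    return [e["t"] for e in events
            if e["object"] == wanted and t_start <= e["t"] <= t_end]

# "Was a red backpack spotted between t=0 and t=250?"
hits = edge_query(events, "red backpack", 0, 250)
```

The payoff is in what does not move: the core receives a short list of timestamps instead of the full event stream.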
The primary goal, however, is to have an analytics space that is sufficiently large to hold both current and historical data at a scale that is currently not possible to achieve, and to get real-time results out of it. And because it is in-memory processing, all the latency introduced by taking data from disk to memory, memory to processor, processor to cache, back to processor (and repeat several times) and finally out to memory and then to disk is eliminated.
The next steps toward a real product include building up the growing set of hardware and software technologies that can now be engineered into 'products' and into High Performance Computing road maps.
"The second step, having moved from simulations to emulators running on SuperDomes, and on to where we are now with this prototype, is that we now need to select the partners and customers that we want to land actual workloads on, to further increase our understanding. This will help us determine what will be the first real instantiation of what we would call 'The Machine'. I can tell you right now we have a pretty clear line of sight on how it can address problems in High Performance Computing and analytics work."
An obvious target here is SAP and its growing range of HANA-based applications. Wheeler agreed that HPE has a long history with running SAP applications, and estimated it currently runs some 70% of all HANA-based applications. He would confirm nothing, of course, but it seems unlikely that SAP, and some of its customers, will fail to make that list of test subjects.
There are still so many questions to be answered about 'The Machine', some of which may yet kill it. For example, when asked about addressing yottabytes of memory that is simultaneously processing in real time, his response was a classic of the scientific milieu.
"We have found some operating system issues with this getting to the 160TB level. But we do have a conceptual handle on what is required to get to the yottabyte level."
The big question, of course, is 'when', and while Wheeler was understandably reticent to give any indication, the signs are that the short version of the answer is 'not any time soon'. This, in turn, raises areas of speculation, some of quite a serious nature.
For example, while SAP has garnered some reasonable traction with its HANA in-memory processing technology, it is interesting that not too many others have really piled in behind it. This raises the question of whether the technology is really only good for certain types of brute-force analytic applications.
That would explain why others, even those playing in the analytics space, are none too fussed about following the SAP lead.
Or is it a case that there are times when technologies and use cases coincide? It is not uncommon for early iterations of technologies to appear and then fade away, because the tech itself is not quite ready, or the functional need has not yet developed amongst users. Later, however, the time, the technology, and the user need can all be right. Example? The mobile phone: it was a housebrick you could make and receive telephone calls on. But when it gained a camera and an internet connection, and could slip into a pocket, it became an extension of the 'self'.
Where is `The Machine' on this scale? As a prototype it is difficult to say, and it is even more difficult to suggest when might be a good time for HPE to be ready with a product. Some of the answer will not even be in HPE's hands, for it will depend upon how well the legacy technologies hold out. Current commodity processors are really only pumped-up versions of the Intel 4004 processor chip introduced in 1971, and they work within a basic systems architecture first described by John von Neumann back in 1945.
Fundamentally, current `stuff', just about all of it, is definably old. But it works, and generally works well. Is it time to replace it? Quite possibly, and it is possible to see much of current tech development work as just trusses, Band-Aids and other surgical appliances designed to keep those aged architectures hanging together.
But it is also possible to see just what HPE has riding on the future success of this in-memory architecture. The company has divested itself of its big systems/SI capabilities, as well as much of its middleware/software activities. It seems determined to be a leading technology developer and provider, with a strong emphasis on hardware to boot. Yet that, inevitably, puts it up against leaner, faster, lower-cost competition that might not have the same depth of experience and expertise, but will have more daring and greatly reduced risk aversion compared to HPE.
It is reasonable to suppose, therefore, that `The Machine' will not appear as a product before three years have passed, more likely five. A lot of tech water will have passed under the bridge in that time, and it is quite possible that one of the small, smart companies will come up with an analytical tech that sits between what is available now and what HPE can eventually bring forth. If that is good enough, it might be the death of `The Machine', and even of HPE.
If not, maybe CIOs need to start thinking, fantasising, about what they might want to achieve if they could analyse anything against any number of other anythings, in real time. Give it five years and it might be available.
For reasons I cannot defend by any other justification than there lies the direction in which my knee doth jerk, I think `The Machine' prototype marks the birth of the next big technology blockbuster. But I also think HPE now has a tiger by the tail, and with the departure of so many other businesses which were, while maybe not desperately profitable, potentially resilient alternatives for the company, that tiger may well bite. Now the company seems increasingly exposed as a mainly hardware tech business playing high-roller poker with an unknown high-risk tech development as its stake.
See more here:
HPE and `The Machine' potentially the next big IT blockbuster, but one helluva gamble - Diginomica
Posted in Mind Uploading
Comments Off on HPE and `The Machine’ potentially the next big IT blockbuster, but one helluva gamble – Diginomica
To kill net neutrality rules, FCC says broadband isn’t telecommunications – Ars Technica
Posted: at 10:39 pm
The Federal Communications Commission's plan to gut net neutrality rules and deregulate the Internet service market may hinge on the definition of the word "broadband."
In February 2015, the FCC's then-Democratic leadership led by Chairman Tom Wheeler classified broadband as "telecommunications," superseding the previous treatment of broadband as a less heavily regulated "information service." This was crucial in the rulemaking process because telecommunications providers are regulated as common carriers under Title II of the Communications Act, the authority used by the FCC to impose bans on blocking, throttling, and paid prioritization.
Thus, when the FCC's new Republican majority voted on May 18 to start the process of eliminating the current net neutrality rules, the commission's Notice of Proposed Rulemaking (NPRM) also proposed redefining broadband as an information service once again.
To make sure the net neutrality rollback survives court challenges, newly appointed FCC Chairman Ajit Pai must justify his decision to redefine broadband less than three years after the previous change. He argues that broadband isn't telecommunications because it isn't just a simple pipe to the Internet. Broadband is an information service because ISPs give customers the ability to visit social media websites, post blogs, read newspaper websites, and use search engines to find information, the FCC's new proposal states. Even if the ISPs don't host any of those websites themselves, broadband is still an information service under Pai's definition because Internet access allows consumers to reach those websites.
Telecommunications, as defined by Congress in the Communications Act, transmits information of the user's choosing to and from endpoints specified by the user without making any changes to the user's information.
Pai's claim that broadband isn't telecommunications might not make sense to consumers, who generally use their Internet connections to access websites and online services offered by companies other than their ISPs, as a TechCrunch article recently argued. But courts have granted the FCC wide latitude on how it defines broadband over the years, essentially ruling that the FCC can classify Internet service however it wants.
Yes, there are plenty of instances in which courts have overturned FCC decisions, including a 2014 case that vacated an earlier attempt to impose net neutrality rules. But when it comes to defining broadband as either an information service or telecommunications, judges have allowed FCC decisions to stand as long as the commission does a reasonably good job of justifying itself. That 2014 decision didn't dispute the FCC's authority to impose net neutrality rules or reclassify ISPs; rather, judges said the FCC could impose strict versions of net neutrality rules as long as it changed its classification of broadband.
Wheeler relied on the court system's deference to FCC decisions on this matter when he successfully fought off a lawsuit filed by ISPs, and Pai is hoping that judges will grant the same courtesy after the FCC changes its mind.
The Communications Act specifically defines telecommunications as "the transmission, between or among points specified by the user, of information of the user's choosing, without change in the form or content of the information as sent and received." A telecommunications service is the offering of telecommunications for a fee directly to the public.
The 2015 FCC order that turned ISPs into common carriers and imposed net neutrality rules said that the statutory definition of telecommunications applies to broadband, as evidenced by how ISPs market their services to consumers, consumers' expectations from broadband providers, and the way the networks operate.
ISPs might also offer information services such as e-mail and online storage, just like any other company that offers services over the Internet. But the FCC in 2015 said that ISPs' information services are separate offerings from broadband. As a result, the Internet plan you buy from an ISP is a regulated common carrier service even though those same providers offer some services that aren't strictly telecommunications.
Pai's argument that broadband isn't telecommunications doesn't hinge on ISPs offering their own e-mail and online storage services. Instead, he says the core broadband offering itself isn't telecommunications.
Landline and mobile voice service are both considered telecommunications by the FCC. But "Internet service providers do not appear to offer 'telecommunications,'" because broadband Internet users do not typically specify the points between and among which information is sent online, Pai's NPRM argues. It continues:
Instead, routing decisions are based on the architecture of the network, not on consumers' instructions, and consumers are often unaware of where online content is stored. Domain names must be translated into IP addresses (and there is no one-to-one correspondence between the two). Even IP addresses may not specify where information is transmitted to or from because caching servers store and serve popular information to reduce network loads. In short, broadband Internet users are paying for the access to information with no knowledge of the physical location of the server where that information resides. We believe that consumers want and pay for these functionalities that go beyond mere transmission, and that they have come to expect them as part and parcel of broadband Internet access service.
Under this interpretation, the fact that consumers specify which websites they want to visit isn't the same thing as specifying the "points" they want to reach. Broadband users would have to specify the IP addresses and caching servers they want to connect to in order for broadband providers to become the dumb pipe described in the definition of telecommunications.
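The NPRM's point that domain names and IP addresses have no one-to-one correspondence is easy to demonstrate. Below is a minimal Python sketch (illustrative only; the `resolve` helper is not from the FCC filing or this article): the user names a site, and the network decides which of possibly several addresses actually answers.

```python
import socket

def resolve(hostname):
    """Return the IP addresses a hostname currently resolves to.

    One domain name can map to many addresses (load balancing, caching
    servers), and one address can serve many names, so the mapping
    between the two is not one-to-one.
    """
    infos = socket.getaddrinfo(hostname, 80, proto=socket.IPPROTO_TCP)
    # Each entry is (family, type, proto, canonname, sockaddr); the
    # address is the first element of sockaddr.
    return sorted({info[4][0] for info in infos})

# The user specifies a name; the resolver picks the endpoint(s).
print(resolve("localhost"))
```

Running this against a large public website typically returns several addresses, and repeating the lookup can return different ones, which is exactly the behaviour the NPRM leans on.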
An information service, by contrast, is defined in the Communications Act as "the offering of a capability for generating, acquiring, storing, transforming, processing, retrieving, utilizing, or making available information via telecommunications." The information service definition "includes electronic publishing, but does not include any use of any such capability for the management, control, or operation of a telecommunications system or the management of a telecommunications service."
The "capability" part of the definition is key, according to the FCC's new argument, because broadband offers the capability to provide the functions described in the definition of information service. Pai's NPRM thus argues that today's broadband services meet the statute's definition of an information service:
We believe that Internet service providers offer the capability for generating, acquiring, storing, transforming, processing, retrieving, utilizing, or making available information via telecommunications. Whether posting on social media or drafting a blog, a broadband Internet user is able to generate and make available information online. Whether reading a newspaper's website or browsing the results from a search engine, a broadband Internet user is able to acquire and retrieve information online. Whether it's an address book or a grocery list, a broadband Internet user is able to store and utilize information online. Whether uploading filtered photographs or translating text into a foreign language, a broadband Internet user is able to transform and process information online. In short, broadband Internet access service appears to offer its users the capability to perform each and every one of the functions listed in the definition, and accordingly appears to be an information service by definition.
Go here to see the original:
To kill net neutrality rules, FCC says broadband isn't telecommunications - Ars Technica
Eavesdropping on the powerhouse producer and rising star. – Wonderland Magazine
Posted: May 30, 2017 at 2:31 pm
Eavesdropping on the powerhouse producer and rising star.
Tender Central is a musician following an intriguing path. Classically trained on cello and piano, she has since spent time in Ben Howard's band, and then collaborated on her own music with dancefloor-orientated producers such as Jakwob, who also released her first single, "Wake Me Up", on his label Boom Ting Recordings.
The result of these sessions is intricate yet calming music. If a track has been granted a beat, it's fair to say, as with "Wake Me Up", it won't be suffering a lack of invention. Her highly recommendable live performances showcase her wide variety of influences, and lay bare an artist looking to engage the heart and mind, before eventually getting you dancing.
Current single "Lava", produced by Kideko, features a gorgeous guest turn from Matthew Hegerty of Matthew and the Atlas, and has been released as part of the Communion Singles Club. We sat down with Tender Central (real name India Bourne) and Jakwob to discuss their collaborations to date.
How were the early recording sessions between you two?
Tender Central: Brilliant.
Jakwob: It was different, because we're both very different types of musicians. It worked because all the things I didn't know, you knew, and all the things you didn't know, I knew.
TC: I think we really complement each other in that way, don't we? I really didn't know anything about beats. My background was cello, piano and classical…
J: …that's what I want my background to be!
TC: It's like you had everything I didn't! So it was really interesting. The way James (Jakwob) hears sound is so cool. I think all the songs we do are a great combination of instrumentation, but also space.
J: I like working with people that are trying to figure something out. They don't know what they're looking for, but they're up for an adventure or discovery. I knew that India has so many ideas. There was so much scope for experimentation. But to be honest, working with very good musicians is always a bit intimidating!
Any lessons learned from working with each other?
J: Get piano lessons! I think mine has been not throwing the kitchen sink at stuff. In between working with each other I was doing a lot of pop stuff. During the day I'd be throwing so much stuff into a pop record, and then in the evening maybe working with India, putting the essentials in and really thinking about what you're writing about. Let the topic of the song be reflected in the production.
Do you both have similar influences?
TC: We love the same tracks actually.
J: There's not much we disagree on. In terms of heritage, we're probably completely different. If we go back to how we both came into music, it was probably vastly different tastes.
TC: Playing cello growing up, I had to learn a classical repertoire, and I had to sing a lot of classical repertoire.
J: And then I was buying garage records probably!
TC: So there we go, our influences right at the beginning were probably totally different, which I loved. We were always going to come at it with a slightly different angle, but appreciate the same songs.
How would you describe the music you've made to somebody who hasn't heard it?
TC: I'm really bad with the genre thing.
J: I think there's no point in genres anymore. The only reason for genres is for inputting data when you're uploading something to Spotify or SoundCloud. It's all irrelevant nowadays.
TC: [The music is] quite vocal-led. Electronic kind of tribal.
J: You live in the West Country, don't you? For me it sounds like Ancient England.
TC: Yeah with a twist.
J: Very British sounding, I think. You've drawn from so many other countries.
TC: I'm influenced quite a lot by Celtic melodies, but I also listen to a lot of Indian music. I love the inflections of the vocals.
You're playing a bunch of festivals, what can people expect from the shows?
TC: It's really upbeat. Expect to dance. Dancing is my favourite thing, apart from music. I love music that makes me move. Hopefully I take the listener somewhere. I've so enjoyed researching these songs and figuring out what I really want to say. Hopefully the lyrical content and the instrumentation will be interesting to people. I just want to make people feel good. Expect a bit of a journey, and to dance.
J: You want to hear it at a festival. It's definitely a festival set, for sure.
Photography
Flore Diamant
May 30, 2017
More here:
Eavesdropping on the powerhouse producer and rising star. - Wonderland Magazine
Using pilot feedback to refine system: GSTN chairman Navin Kumar – The Indian Express
Posted: May 26, 2017 at 4:05 am
Written by Aanchal Magazine | Updated: May 26, 2017 10:30 am GSTN chairman said the project consists of a front-end and a back-end.
With the ambitious Goods and Services Tax set to be rolled out from July 1, its IT backbone, GSTN, is preparing for all kinds of contingencies. It recently conducted a pilot with over 3,000 taxpayers and with the feedback is trying to make the system user friendly, GSTN Chairman Navin Kumar told The Indian Express in an interview. Excerpts:
The IT preparedness review meeting was held last week. GSTN is scheduled to make a presentation in the next GST Council meeting. What is left?
First, let me say what has been done. The GSTN project consists of a front-end and a back-end. The front-end has the GST portal, on which three services will run: registration of new taxpayers, filing of returns by taxpayers, and payment of taxes. The back-end has the IT systems of all the tax departments of the states and the Centre.
When we started, the Centre asked us to make an assessment of the revenues of the states and how their systems work. We found that all states have an IT system for their commercial taxes, but they were in various stages of development. Some were quite sophisticated but some were rudimentary. And, a majority of them were somewhere in between. We told the Centre that in our assessment, not more than 7-8 states will be able to upgrade or bring in a new system for GST in time. So, the Centre convened a meeting in 2014, in which commercial tax officers of all states participated.
There we shared our assessment and said that we would do the front-end and the back-end, if the states opt for it. At that time, 12 states opted for it. We selected Infosys as our partner in November 2015. Between November 2015 and now, 12 more states have joined for their back-end. Now we are doing for 20 states and 7 Union territories, total of 27 states/UTs.
The front-end will consist of IT hardware and software, and we will have to provide connectivity and bandwidth. The hardware is ready. Connectivity and bandwidth have been arranged. We have four data centres, which are connected to the data centres of the states as well as CBEC. That work of connectivity with all states has been done. With the CBEC, it's going on and will happen by the end of this month. That leaves us with the software part.
When we selected Infosys, the Constitutional Amendment Bill had not been passed. We were told by the Empowered Committee of State Finance Ministers that Infosys should be asked to start with the software development but keep the hardware part on hold, as it could be done after the Bill's passage. So, software development started in November 2015 and we are in a happy position that the software is more or less done now.
Between May 2 and May 16, we conducted a beta test where we invited over 3,000 taxpayers to work on our software and try to file returns and make tax payments, just as they would once GST gets rolled out. In the process, we took their feedback and we are now working on it. We are trying to make the system user friendly and simple. When you have a large system, there are problems. All those problems which came up during beta testing have either been resolved or are in the process of being resolved.
Could you give some examples of the issues that came up?
For example, the interface was found to be a bit complicated by some taxpayers. So we said we would make it easier to interact with, and we have simplified it now. We have given an offline tool, which they can download and use for filing their returns and uploading their invoices. One problem that we encountered was that older computers were not compatible with the current (GSTN) software. We had to revise it, so that even older computers can use it.
Some people were not able to log in, some had problems with using their digital signature certificates. Some problems were user specific, so they had to be educated. We also have a help desk. Wherever there was a problem, either in software or the system, we resolved it. Now, we are more confident that the software will be more user friendly. Beyond this, the refinements are still going on. Then, we have to register GST practitioners the consultants, advocates, chartered accountants, who would be advising and helping the taxpayers file their returns. We are going to start their registration after the enrolment window closes.
In the meantime, our IT infrastructure, which is ready, is being tested. Now, we are putting our software on the system, and load tests and performance tests are on. They will continue till about the middle of June. Another thing is Standardisation Testing and Quality Certification (STQC), which comes under the Ministry of Electronics and IT. They are supposed to audit all large government IT projects. We have been interacting with them. The software part, the performance test, vulnerability assessment and penetration testing, which is the security part, will be done now.
Especially in light of the ransomware attack
Yes. They will check if our arrangements are adequate or not. This will also happen in June. Once all these things are done, we will be ready for roll out.
How many GST practitioners would be registered?
Around 4-5 lakh across the country. Once the portal starts, then the taxpayers will be able to see which tax practitioners are available in their area. If they want to consult anybody they can go there. Similarly, tax practitioners will see taxpayers in their area.
What about GST Suvidha Providers?
GST Suvidha Providers are IT companies or accounting software companies. Some of them are providing services to taxpayers for things like accounting packages, inventory management, invoicing and other services under VAT and service tax. What we thought was that we are providing our portal where people will come and file their returns and make tax payments, but for large or medium-sized companies, the number of invoices may be very large. The new GST law differs from the existing VAT or service tax law in the sense that here the returns have to carry the invoice data, which was not there earlier. That is going to be a crucial part of the whole compliance process.
For small taxpayers, maybe they have to enter 100 or 200 invoices, but if you have invoices running into thousands, that may not be possible on the portal. So we thought of involving firms already providing such services. While our portal is supposed to cater to 80 lakh taxpayers, they can have a portal that caters to maybe 5,000 or 10,000 taxpayers. Supposing we have 30, 40 or hundreds of such GSPs (GST Suvidha Providers), then our load will be distributed.
It was said that GSTN can handle 300 crore invoices a month. Will the actual size be tested only post the roll out?
Until now, some states have been taking invoice-level data, but mostly they were taking it as annexures, and the data was not being used or uploaded. Some states like Gujarat and Maharashtra had started it, so we took data from them and on that basis we have estimated the total anticipated invoices. But we will get to know only when GST starts. So far as testing is concerned, we are testing for 700 crore invoices. 320 crore invoices is the estimate, and we are testing at double the number. I do not think that it will exceed the number, but we will know the actual position when people start filing.
Are you expecting any disruption in the initial phase?
We have already done the enrolment, so our portal has been launched and used. 60 lakh people have come to the portal, so we have seen how it works. The difference is that for enrolment there was, in a way, no deadline. In the first phase, we announced it for two months till December and then we kept extending. The maximum load we saw was about 2 lakh taxpayers on a single day. Here, there will be a deadline for the returns. We have to see how they come and what the load is. We have studied VAT and found that almost 50 per cent of taxpayers file their returns on the last day. We kept that in mind while making an assessment of the compute capacity that we must have.
So you are preparing for the possibility that on the last day, the load will shoot up
Yes. But here, what the taxpayers have to do first is give the invoice data. In our system, we have kept provision for uploading that data on a daily basis. They can do it at the time of issuing invoices. When they are selling something in the shop, they can upload the invoice immediately. But that may not happen immediately. We will now start a media campaign saying don't wait till the 10th of every month. At least they can upload the invoice data at the end of every day or at the end of every week; then they will not face any last-moment problems.
It would be easier for big firms, but small taxpayers might face issues like connectivity
Should not be. Even in the worst areas, you are able to connect to the internet for some time, 10-15 minutes at least. That's why we have given this offline route where you can fill in your data, so that whenever there is connectivity, you can send the data to the portal. That is what we will be telling the taxpayers. In our training of tax officers, we have told them that when they interact with taxpayers, they should advise them not to wait for the last moment. Because this is the most intensive part of the return-filing process. Uploading of invoices will test our system because most of the data is coming there. The GST returns will be made by the system based on this data. This is the most difficult part. If people take to uploading the data periodically, then it won't be a problem. But we have designed the system assuming that will not happen and 50 per cent of them will come in the last two days.
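The offline route Kumar describes is essentially a store-and-forward pattern. The Python sketch below is a hypothetical illustration of the idea, not the actual GSTN offline tool: invoices accumulate locally at the time of sale and are flushed to the portal whenever a connection is available, so even patchy connectivity is enough (`upload` stands in for the real portal call).

```python
class OfflineInvoiceQueue:
    """Store invoices locally; push them to the portal when online."""

    def __init__(self, upload):
        self.upload = upload   # stand-in for the real portal upload call
        self.pending = []

    def add(self, invoice):
        """Record an invoice at the time of sale, connected or not."""
        self.pending.append(invoice)

    def flush(self):
        """Try to send everything pending; keep whatever fails.

        Returns True once the local queue is fully drained.
        """
        still_pending = []
        for invoice in self.pending:
            try:
                self.upload(invoice)
            except ConnectionError:
                still_pending.append(invoice)  # retry on the next flush
        self.pending = still_pending
        return not self.pending
```

Called at the end of each day or week, `flush` retries anything that failed earlier, which is exactly the "upload periodically, don't wait for the 10th" behaviour the campaign encourages.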
It's been seen in the existing tax system that if the IT system fails, transporters are left stranded. If the server fails, how will the GSTN system cope?
Currently, each state has its own system. Some states have transit permits, some do it manually, while some generate them on computer systems. So, wherever it is on an IT system, they have to generate from the IT system. If that is down, then it's a problem. But in the system that we have built, the arrangement is that if something fails, there is something else to take over immediately. That is called business continuity planning (BCP) and is part of our IT system. With Infosys, we have agreed a maximum amount of time allowed for such failures, the time during which it should be corrected. That is why we have four data centres.
One is in Delhi, one in Bengaluru, another near Delhi and another near Bengaluru. The reason is that if there is some problem in one data centre, the other one immediately takes over. So, users will not even feel the difference. Suppose there's an earthquake, when everything is down; then it may take longer. But normally the taxpayer will not even feel that there has been a switch. And for that, we are also providing connectivity and bandwidth. From our data centre to a state data centre or CBEC data centre, we always have two lines, and these are provided by different service providers, so that if one is down, the other can still work.
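The two-lines, two-data-centres arrangement Kumar describes is a standard failover pattern. As a minimal client-side sketch (the function and endpoint names below are hypothetical, not GSTN's actual design): try the primary route first, and if it fails, retry on the secondary.

```python
def send_with_failover(payload, endpoints, send):
    """Try each endpoint in order; return the first successful result.

    `send` is a stand-in for the real transport call, and `endpoints`
    is an ordered list such as [primary, secondary].
    """
    last_error = None
    for endpoint in endpoints:
        try:
            return send(endpoint, payload)
        except ConnectionError as err:
            last_error = err  # remember the failure, try the next route
    raise last_error  # every route was down


# Example: the primary line is down, so the secondary takes over
# transparently, just as described above.
def flaky_send(endpoint, payload):
    if endpoint == "primary-dc":
        raise ConnectionError("primary data centre unreachable")
    return f"delivered via {endpoint}"

print(send_with_failover("return-data", ["primary-dc", "secondary-dc"],
                         flaky_send))  # prints "delivered via secondary-dc"
```

The caller never sees the failure unless every route is down, which is the "users will not even feel the difference" property.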
GSTN services to Centre and states/UTs have been exempted. Why?
This question will be better answered by the people in the government. My assessment is that this is a service that we are providing to the taxpayers and the government. These are services actually offered by the government, and the government has outsourced the work to us. The idea is: why put a tax on that? In any case, as per the revenue model of GSTN, we will be raising user-charge bills, and the users are taxpayers. We are supposed to charge them, but the government said it will pay on their behalf. That burden will be shared by the Centre and the states in proportion to their registered taxpayers. So, if we include an element of tax, then the money will be coming from the government only. So better to keep that out.
Go here to see the original:
Using pilot feedback to refine system: GSTN chairman Navin Kumar - The Indian Express
3 automated transcription tools for journalists – Journalism.co.uk
Posted: at 4:05 am
You have just returned to the office after a brilliant interview, ready to write your masterpiece, when suddenly the sense of dread hits you: it's time to transcribe.
Although often seen as a laborious task, transcribing can be useful to reinforce key points from your interview, especially when you're writing complex stories that need greater attention to detail.
However, if you are short on time and need to publish quickly, it might be worth checking out these transcription tools for journalists.
They require audio files to be as clear as possible, with minimal background noise and distorted speech, but these options are cheaper than using commercial transcription services.
Scribe, a new tool recently developed by two students from Dublin University, uses the Google Speech API to transcribe audio files in a matter of minutes.
Simply go to the website, enter your email address, upload your audio file and select the language it's in. The site will charge you 0.09 cents per minute, which works out as 2.70 (approximately 2.30) for a 30-minute transcription.
Click 'submit' and wait for Scribe to transcribe the file. When the process has been completed, you'll receive a link to your transcription on the Scribe website.
If you want to make sure the transcription is accurate, you can listen back at this point and edit the text, also attaching hyperlinks where you need to.
You'll find that interviews recorded in person transcribe more accurately than those over the phone, and punctuation isn't always 100 per cent correct, but if you're happy to scan through afterwards, it's great for getting the audio on the page.
Trint works in a similar fashion to Scribe, allowing users to upload a file directly to the website, choose the language, and receive the file via an email link.
Before uploading, the tool reminds you to upload clear conversations with little background noise; however, it struggles slightly with transcribing English spoken with a stronger accent.
The final product gives you, in almost every instance, a clear transcription, along with timecodes for individual paragraphs.
Users are offered a free trial to try out the tool, after which you can choose from a range of plans. The options include paying monthly according to how much audio you want transcribed, which costs 13.20 per hour, or 100 a month for 10 hours of transcription, and there's also a monthly roll-over on unused hours.
Pop Up Archive, created by former journalists Anne Wootton and Bailey Smith, is aimed at podcasters as an online transcription tool that indexes transcripts to make them searchable.
Users can pay from $15 (12) for one hour monthly, all the way up to $300 (231) for 25 hours every month; however, you should bear in mind that unused hours don't carry over to the next month.
The tool transcribes audio recorded in English in real time, noticeably slower than Scribe and Trint, but we found the quality very good, with only a few punctuation errors in transcriptions of phone interviews, and timestamps matched to the millisecond. Pop Up Archive can also differentiate between voices, useful if you have more than one interviewee in the clip.
Once you've uploaded your file, you have the option to add metadata, including title, format, collection, images, and any relevant tags, if you'd like your audio to be publicly available in the site's archive.
See the article here:
3 automated transcription tools for journalists - Journalism.co.uk