<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Blog &#8211; B|KM &#8211; B2B SaaS</title>
	<atom:link href="https://bkmsoftware.com/category/blog/feed" rel="self" type="application/rss+xml" />
	<link>https://bkmsoftware.com</link>
	<description></description>
	<lastBuildDate>Mon, 01 Mar 2021 13:34:48 +0000</lastBuildDate>
	<language>en</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://bkmsoftware.com/wp-content/uploads/2021/02/cropped-cropped-logo2020-black-32x32.png</url>
	<title>Blog &#8211; B|KM &#8211; B2B SaaS</title>
	<link>https://bkmsoftware.com</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>BPM is better</title>
		<link>https://bkmsoftware.com/bpmgdpr</link>
		
		<dc:creator><![CDATA[Lorenz Baermann]]></dc:creator>
		<pubDate>Mon, 01 Mar 2021 13:34:48 +0000</pubDate>
				<category><![CDATA[Blog]]></category>
		<category><![CDATA[bpm]]></category>
		<category><![CDATA[gdpr]]></category>
		<category><![CDATA[tools]]></category>
		<guid isPermaLink="false">https://info.bkmsaas.net/?p=238</guid>

					<description><![CDATA[Why Business Process Management is a better choice than only process mapping in GDPR tools Companies and organizations are experiencing the first stage of a new digital support: GDPR management tools. We analyzed some of them. The problem In some cases the approach of the&#8230;]]></description>
										<content:encoded><![CDATA[
<h2 class="wp-block-heading">Why Business Process Management is a better choice than only process mapping in GDPR tools</h2>



<p>Companies and organizations are experiencing the first stage of a new digital support: GDPR management tools. We analyzed some of them.</p>






<p><strong>The problem</strong></p>



<p>In some cases the solution’s approach is technological (systems designed as if they were independent or static in nature), while in other cases it is functional: technically sound in compliance matters, yet still narrowly specific.</p>



<p>We classify both approaches as mainly marketing-oriented; not in order to criticize the quality of these tools as such, but the fact that these solutions are primarily momentum-driven commercial opportunities seizing a sudden demand, in a market that is still not well versed on the subject. <em>This practice does raise issues</em>.</p>



<p>Talking with GDPR experts, it emerges that some entrepreneurs and executives have adopted a vision that limits GDPR compliance to a (bureaucratic) document management exercise or, even worse, treats it as a one-shot, maintenance-free operation. All this despite the many repeated warnings about the risk of running into huge administrative fines.</p>



<p>Moreover, it has been confided to us that companies apparently prefer presenting ‘official processes’ that do not match their real-world business processes, and then carry on with their usual ones. The bottom line: the purpose of the compliance audit is defeated even though time and money are expended, while the risk exposure remains high.</p>



<p><strong>Back to the past</strong></p>



<p>We note a remarkable parallel to the 1990s, when ISO quality certification was fashionable. It was not uncommon to find entrepreneurs opportunistically chasing a series of certificates, <em>however without any serious intention to change their company culture</em>.</p>



<p>We worked with quite a few of them at the time and, unfortunately but not by chance, none of them brightened their future with such choices. (None of them exists on the market anymore, though this is just a personal observation.)</p>



<p>Three decades later, quality at large (finally) seems widespread in many business environments, and process mapping &amp; re-engineering is nothing new anymore. The resulting benefits are acknowledged as part of our business culture.</p>



<p><strong>An innovative approach – a golden opportunity</strong></p>



<p>Underestimating the interventions required to meet the GDPR, or not taking advantage of all the actions needed during this process, may lead companies to choose the wrong tools, ones that then require serious compliance efforts. Often this road also makes it impossible to connect with other fundamental areas of competence such as Legal and Operations. Given all of the above, we raise a crucial question:</p>



<p><strong><em>Why should companies and organizations re-map their processes only for GDPR purposes?</em></strong><strong> <em>Why do GDPR tools not start from managed processes? </em></strong></p>



<p>Exchange standards are available, such as IDEFx, FFBD or BPMN 2.0 for modeling, or universal formats like XML or JSON, just to provide some examples. Then, how common is the adoption of process mapping tools in practice?</p>
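As an illustrative aside (ours, not taken from any tool mentioned here): because BPMN 2.0 models are plain XML, a GDPR tool could consume an existing process map with a few lines of code instead of forcing a re-mapping. The model fragment and task names below are hypothetical.

```python
# Illustrative sketch: reading task names from a minimal, hypothetical
# BPMN 2.0 model with the Python standard library.
import xml.etree.ElementTree as ET

BPMN_NS = "http://www.omg.org/spec/BPMN/20100524/MODEL"

# A made-up, minimal BPMN 2.0 fragment (not from any real GDPR tool).
model = """<definitions xmlns="http://www.omg.org/spec/BPMN/20100524/MODEL">
  <process id="consent_handling" isExecutable="false">
    <task id="t1" name="Collect consent"/>
    <task id="t2" name="Record lawful basis"/>
  </process>
</definitions>"""

root = ET.fromstring(model)
# Namespaced tag lookup: every element lives in the BPMN default namespace.
tasks = [t.get("name") for t in root.iter(f"{{{BPMN_NS}}}task")]
print(tasks)  # → ['Collect consent', 'Record lawful basis']
```

Any standards-aware tool could ingest such a model directly, which is precisely the integration these GDPR tools tend to miss.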



<p>This lack of integration with best practices and previous investments leads to costly attrition.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>BPM is better</title>
		<link>https://bkmsoftware.com/process-mapping-attrition-in-gdpr-tools-or-bpm-opportunity%ef%bb%bf-2</link>
		
		<dc:creator><![CDATA[Lorenz Baermann]]></dc:creator>
		<pubDate>Sun, 14 Feb 2021 15:31:55 +0000</pubDate>
				<category><![CDATA[Blog]]></category>
		<category><![CDATA[cumplimiento]]></category>
		<category><![CDATA[gdpr]]></category>
		<category><![CDATA[Process Management]]></category>
		<guid isPermaLink="false">https://info.bkmsaas.net/gen/process-mapping-attrition-in-gdpr-tools-or-bpm-opportunity%ef%bb%bf-2/</guid>

					<description><![CDATA[Why Business Process Management is better than mapping alone for GDPR purposes Companies and organizations are experiencing the first stage of a new digital support: GDPR management tools. We analyzed some of them. As with all previous cases of new&#8230;]]></description>
										<content:encoded><![CDATA[
<h2 class="wp-block-heading">Why Business Process Management is better than mapping alone for GDPR purposes</h2>



<p id="tw-target-text">Companies and organizations are experiencing the first stage of a new digital support: GDPR management tools. We analyzed some of them.</p>



<p id="tw-target-text">As with all previous cases of new business compliance processes, there is today a growing number of tools on the market addressing the all-new European privacy law, the General Data Protection Regulation, which came into force on 25 May 2018. Our main conclusion: <em>these privacy tools have design limitations</em>.</p>



<h3 class="wp-block-heading">The problem</h3>



<p>In some cases the solution&#8217;s approach is technological (systems designed as if they were independent or static in nature), while in other cases it is functional: technically sound in compliance matters, yet still narrowly specific.</p>



<p>We classify both approaches as mainly marketing-oriented; not in order to criticize the quality of these tools as such, but the fact that these solutions are primarily momentum-driven commercial opportunities seizing a sudden demand, in a market that is still not well versed on the subject. This practice does raise issues.</p>



<p>Talking with GDPR experts, it emerges that some entrepreneurs and executives have adopted a vision that limits GDPR compliance to a &#8211; bureaucratic &#8211; document management exercise or, even worse, treats it as a one-shot, maintenance-free operation. All this despite the many repeated warnings about the risk of incurring huge administrative fines.</p>



<p>Moreover, it has been confided to us that companies apparently prefer presenting &#8220;official processes&#8221; that do not match their real-world business processes, and then carry on with their usual ones. The bottom line: the purpose of the compliance audit is defeated even though time and money are expended, while the risk exposure remains high.</p>



<h3 class="wp-block-heading">Back to the past</h3>



<p>We note a remarkable parallel to the 1990s, when ISO quality certification was fashionable. It was not uncommon to find entrepreneurs opportunistically chasing a series of certificates, however without any serious intention to change their company culture.</p>



<p>We worked with quite a few of them at the time and, unfortunately but not by chance, none of them brightened their future with such choices. (None of them exists on the market anymore, though this is just a personal observation.)</p>



<p>Three decades later, quality at large (finally) seems widespread in many business environments, and process mapping and re-engineering are nothing new anymore. The resulting benefits are acknowledged as part of our business culture.</p>



<h3 class="wp-block-heading">An innovative approach: an opportunity</h3>



<p>Underestimating the interventions required to meet the GDPR, or not taking advantage of all the actions needed during this process, may lead companies to choose the wrong tools, ones that then require serious compliance efforts. Often this road also makes it impossible to connect with other fundamental areas of competence such as Legal and Operations. Given all of the above, we raise a crucial question:</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow"><p><em>Why should companies and organizations map their processes only for GDPR purposes? Why do GDPR tools not start from managed processes?</em></p></blockquote>



<p>Exchange standards are available, such as IDEFx, FFBD or BPMN 2.0 for modeling, or universal formats like XML or JSON, just to provide some examples. Then, how common is the adoption of process mapping tools in practice?</p>



<p>This lack of integration with best practices and previous investments leads to costly attrition.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>GDPR integration in contracts management</title>
		<link>https://bkmsoftware.com/gdpr-integration-in-contracts-management-opportunity-for-a-better-sensitive-data-management-and-compliance</link>
		
		<dc:creator><![CDATA[Lorenz Baermann]]></dc:creator>
		<pubDate>Mon, 01 Jul 2019 16:15:17 +0000</pubDate>
				<category><![CDATA[Blog]]></category>
		<category><![CDATA[article28]]></category>
		<category><![CDATA[clm]]></category>
		<category><![CDATA[gdpr]]></category>
		<category><![CDATA[gdprarticle28]]></category>
		<guid isPermaLink="false">https://info.bkmsaas.net/gen/gdpr-integration-in-contracts-management-opportunity-for-a-better-sensitive-data-management-and-compliance/</guid>

					<description><![CDATA[Contract Management tools and CLM (Contract Lifecycle Management) practices offer the opportunity to integrate managed processes from the very beginning of the data stream: the contracts. Article 28 of GDPR provides some guidelines that we develop in this paper. ]]></description>
										<content:encoded><![CDATA[
<h2 class="wp-block-heading">An opportunity for a better sensitive data management and compliance</h2>



<p>Contract Management tools and CLM (Contract Lifecycle Management) practices offer the opportunity to integrate managed processes from the very beginning of the data stream: the contracts. Article 28 of GDPR provides some guidelines that we develop in this paper. </p>



<p><strong>Contracts and GDPR</strong></p>



<p>Organizations can rather easily identify the sources of sensitive data in their contracts, either because contracts <em>de facto</em> represent the data-collecting events (B2C and B2B) or because data treatment or manipulation is the <em>subject</em> of the contracts themselves (B2B). The latter is the case of third parties involved in data manipulation or treatment, the so-called “processors” under Article 28 of the GDPR. Relationships with these parties are regulated by contracts.</p>



<p><strong>Article 28</strong></p>



<p>The EU General Data Protection Regulation 2016/679 (GDPR), in effect since 25 May 2018, states in Article 28 that</p>



<p>“…<em>the controller shall use only processors providing <strong>sufficient guarantees</strong> to implement appropriate technical and organizational measures in such a manner that processing will meet the requirements of this Regulation and ensure the protection of the rights of the data subject.</em>”</p>



<p>The bold wording in the quoted text above is not a mere typographic choice. The impact of this article has not yet been fully assessed in Brussels, but the concept is crystal clear: a processor’s responsibility goes beyond its own organization; it extends to <em>the whole</em> business network it relies on. This also affects foreign companies and organizations that treat EU citizens’ data.</p>



<p>When dealing with sensitive data, governing relations with <em>processors</em> by contract is no longer merely common sense or best practice but an <em>obligation</em>, as dictated by Article 28, paragraph 3:</p>



<p><em>“Processing by a processor shall be governed by a contract or other legal act under Union or Member State law, that is binding on the processor with regard to the controller […]”</em></p>



<p>From a practical point of view, organizations should develop governance procedures for managing the sensitive-data chain and all relations with processors, ensuring their compliance. This is where CLM can help.</p>



<p><strong>How can CLM help?</strong></p>



<p>CLM’s basic principle is taking <em>full control</em> of the contract lifecycle and of all contract-related aspects impacting the organization. This means that by using CLM practices, companies can control and manage the direct relations between business processes and contracts, treating the latter as sources.</p>



<p>It is a fact that a Legal Audit is a fast and precise operation when CLM tools are adopted. The same cannot be said of traditional or manual legal management: at one of our customers, the Legal Audit process was reduced, after adopting CLM, from 2 or 3 days to 30 minutes.</p>



<p>All the above can be translated into the following general actions:</p>



<ol class="wp-block-list"><li>Identifying specific contracts and contract categories that represent sources of sensitive data.</li><li>Identifying IT and service contracts with third parties and contractors related to point 1.</li><li>Collecting the contracts from point 2 in order to audit the GDPR-required guarantees and the compliance of the involved parties across the whole data stream.</li><li>Integrating CLM with Business Process Management and its link to GDPR process management: data-treatment audit items should be <em>identified</em> together with their legal sources in order to guarantee their management and ease all subsequent process maintenance.</li><li>Evaluating the opportunity of sharing the same tools as a common language between controller and processor.</li><li>Managing GDPR processes (audit and maintenance) using the legal perspective as a starting point.</li></ol>
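A minimal sketch of points 1 to 3 above, assuming a hypothetical contract register (the record fields and names are ours, not from any real CLM product):

```python
# Hypothetical sketch: flagging, in a contract register, which contracts
# are sensitive-data sources and which third-party ("processor") contracts
# must be collected for an Article 28 guarantee audit.
from dataclasses import dataclass

@dataclass
class Contract:
    name: str
    category: str                   # e.g. "customer", "it_service", "supplier"
    handles_personal_data: bool     # is this contract a sensitive-data source?
    counterparty_is_processor: bool # does a third party process the data?

def audit_queue(register):
    """Return the processor contracts to collect for the GDPR audit."""
    sources = [c for c in register if c.handles_personal_data]        # point 1
    processors = [c for c in sources if c.counterparty_is_processor]  # point 2
    return [c.name for c in processors]                               # point 3

register = [
    Contract("B2C subscription terms", "customer", True, False),
    Contract("Cloud hosting agreement", "it_service", True, True),
    Contract("Office cleaning", "supplier", False, False),
]
print(audit_queue(register))  # → ['Cloud hosting agreement']
```

A CLM tool performs essentially this filtering over its whole repository, which is why the Legal Audit becomes so much faster than a manual review.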



<p><strong>Conclusions</strong></p>



<p>Organizations need support with EU sensitive-data compliance; complex activities must be managed, including the review of IT and service contracts. Contract Lifecycle Management tools help organizations in the tedious task of identifying their processors and collecting the related contracts for correct GDPR risk management.</p>



<hr class="wp-block-separator"/>



<p>This article is also available in LinkedIn, in pdf format, <a href="https://www.linkedin.com/feed/update/urn:li:activity:6551485925355065344" target="_blank" rel="noreferrer noopener" aria-label="here (opens in a new tab)">here</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Why Business Process Management is a better choice than only process mapping in GDPR tools</title>
		<link>https://bkmsoftware.com/process-mapping-attrition-in-gdpr-tools-or-bpm-opportunity%ef%bb%bf</link>
		
		<dc:creator><![CDATA[Lorenz Baermann]]></dc:creator>
		<pubDate>Wed, 26 Jun 2019 14:29:21 +0000</pubDate>
				<category><![CDATA[Blog]]></category>
		<category><![CDATA[cumplimiento]]></category>
		<category><![CDATA[gdpr]]></category>
		<category><![CDATA[Process Management]]></category>
		<guid isPermaLink="false">https://info.bkmsaas.net/gen/process-mapping-attrition-in-gdpr-tools-or-bpm-opportunity%ef%bb%bf/</guid>

					<description><![CDATA[Companies and organizations are experiencing the first stage of a new digital support: GDPR management tools. We analyzed some of them. As for all previous cases of new business compliance processes there is today a growing number of tools in the market addressing the all&#8230;]]></description>
										<content:encoded><![CDATA[
<p>Companies and organizations are experiencing the first stage of a new digital support: GDPR management tools. We analyzed some of them.</p>



<p>As with all previous cases of new business compliance processes, there is today a growing number of tools on the market addressing the all-new European privacy law, the General Data Protection Regulation, which came into force on 25 May 2018. Our main conclusion: <em>these privacy tools have design limitations</em>.</p>



<p><strong>The problem</strong></p>



<p>In some cases the solution’s approach is technological (systems designed as if they were independent or static in nature), while in other cases it is functional: technically sound in compliance matters, yet still narrowly specific.</p>



<p>We classify both approaches as mainly marketing-oriented; not in order to criticize the quality of these tools as such, but the fact that these solutions are primarily momentum-driven commercial opportunities seizing a sudden demand, in a market that is still not well versed on the subject. <em>This practice does raise issues</em>.</p>



<p>Talking with GDPR experts, it emerges that some entrepreneurs and executives have adopted a vision that limits GDPR compliance to a (bureaucratic) document management exercise or, even worse, treats it as a one-shot, maintenance-free operation. All this despite the many repeated warnings about the risk of running into huge administrative fines.</p>



<p>Moreover, it has been confided to us that companies apparently prefer presenting ‘official processes’ that do not match their real-world business processes, and then carry on with their usual ones. The bottom line: the purpose of the compliance audit is defeated even though time and money are expended, while the risk exposure remains high.</p>



<p><strong>Back to the past</strong></p>



<p>We note a remarkable parallel to the 1990s, when ISO quality certification was fashionable. It was not uncommon to find entrepreneurs opportunistically chasing a series of certificates, <em>however without any serious intention to change their company culture</em>.</p>



<p>We worked with quite a few of them at the time and, unfortunately but not by chance, none of them brightened their future with such choices. (None of them exists on the market anymore, though this is just a personal observation.)</p>



<p>Three decades later, quality at large (finally) seems widespread in many business environments, and process mapping &amp; re-engineering is nothing new anymore. The resulting benefits are acknowledged as part of our business culture.</p>



<p><strong>An innovative approach – a golden opportunity</strong></p>



<p>Underestimating the interventions required to meet the GDPR, or not taking advantage of all the actions needed during this process, may lead companies to choose the wrong tools, ones that then require serious compliance efforts. Often this road also makes it impossible to connect with other fundamental areas of competence such as Legal and Operations. Given all of the above, we raise a crucial question:</p>



<p><strong><em>Why should companies and organizations re-map their processes only for GDPR purposes? Why do GDPR tools not start from managed processes?</em></strong></p>



<p>Exchange standards are available, such as IDEFx, FFBD or BPMN 2.0 for modeling, or universal formats like XML or JSON, just to provide some examples. Then, how common is the adoption of process mapping tools in practice?</p>



<p>This lack of integration with best practices and previous investments leads to costly attrition.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>What is fog computing? Connecting the cloud to things</title>
		<link>https://bkmsoftware.com/what-is-fog-computing</link>
		
		<dc:creator><![CDATA[Lorenz Baermann]]></dc:creator>
		<pubDate>Thu, 18 Jan 2018 15:56:30 +0000</pubDate>
				<category><![CDATA[Blog]]></category>
		<category><![CDATA[edge computing]]></category>
		<category><![CDATA[Fog computing]]></category>
		<guid isPermaLink="false">https://info.bkmsaas.net/gen/what-is-fog-computing/</guid>

					<description><![CDATA[Fog computing extends the concept of cloud computing to the network edge, making it ideal for internet of things (IoT) and other applications that require real-time interactions.]]></description>
										<content:encoded><![CDATA[<section class="deck viewability">
<h3>Fog computing extends the concept of cloud computing to the network edge, making it ideal for internet of things (IoT) and other applications that require real-time interactions.</h3>
<p class="dateline"><span class="by">Published on&nbsp;<a href="https://www.networkworld.com/article/3243111/internet-of-things/what-is-fog-computing-connecting-the-cloud-to-things.html" target="_blank" rel="noopener"><span class="publisher">NetworkWorld</span></a> By </span><span class="divider break"><a href="https://www.networkworld.com/author/Brandon-Butler/" rel="author">Brandon Butler</a>&nbsp;</span> <span class="pub-date">Jan 17, 2018</span></p>
<p>Fog computing is the concept of a network fabric that stretches from the outer edges of where data is created to where it will eventually be stored, whether that&#8217;s in the cloud or in a customer’s data center.</p>
<p>Fog is another layer of a distributed network environment and is closely associated with cloud computing and the internet of things (IoT). Public infrastructure as a service (IaaS) cloud vendors can be thought of as a high-level, global endpoint for data; the edge of the network is where data from IoT devices is created.</p>
<p>Fog computing is the idea of a distributed network that connects these two environments. “Fog provides the missing link for what data needs to be pushed to the cloud, and what can be analyzed locally, at the edge,” explains Mung Chiang, dean of Purdue University’s College of Engineering and one of the nation’s top researchers on fog and edge computing.</p>
<p>According to the <a href="https://www.openfogconsortium.org/" target="_blank" rel="nofollow noopener">OpenFog Consortium</a>, a group of vendors and research organizations advocating for the advancement of standards in this technology, fog computing is “a system-level horizontal architecture that distributes resources and services of computing, storage, control and networking anywhere along the continuum from Cloud to Things.”</p>
<h2><strong>Benefits of fog computing </strong></h2>
<p>Fundamentally, the development of fog computing frameworks gives organizations more choices for processing data wherever it is most appropriate to do so. For some applications, data may need to be processed as quickly as possible – for example, in a manufacturing use case where connected machines need to be able to respond to an incident as soon as possible.</p>
<p>Fog computing can create low-latency network connections between devices and analytics endpoints. This architecture in turn reduces the amount of bandwidth needed compared to if that data had to be sent all the way back to a data center or cloud for processing. It can also be used in scenarios where there is no bandwidth connection to send data, so it must be processed close to where it is created. As an added benefit, users can place security features in a fog network, from segmented network traffic to virtual firewalls to protect it.</p>
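As a hypothetical sketch of this bandwidth-saving idea (our example, not from the article): an edge node can aggregate raw readings locally and forward only a compact summary plus out-of-range values, instead of streaming every reading to the cloud.

```python
# Illustrative sketch (names and thresholds are made up): an edge node
# summarizes raw sensor readings locally and forwards only an aggregate
# plus anomalies, reducing the bandwidth a cloud round-trip would need.
def edge_report(readings, limit=80.0):
    """Aggregate locally; forward only stats and out-of-range readings."""
    anomalies = [r for r in readings if r > limit]
    return {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "anomalies": anomalies,  # only these need immediate cloud attention
    }

payload = edge_report([70.2, 71.0, 95.5, 69.8])
print(payload["anomalies"])  # → [95.5]
```

Four raw readings collapse into one small payload; only the anomalous value travels onward for urgent handling.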
<h2><strong>Applications of fog computing </strong></h2>
<p>Fog computing is in the nascent stages of being rolled out in formal deployments, but a variety of use cases have been identified as potential ideal scenarios for fog computing.</p>
<p><em>Connected Cars:</em> The advent of semi-autonomous and self-driving cars will only increase the already large amount of data vehicles create. Having cars operate independently requires a capability to locally analyze certain data in real-time, such as surroundings, driving conditions and directions. Other data may need to be sent back to a manufacturer to help improve vehicle maintenance or track vehicle usage. A fog computing environment would enable communications for all of these data sources both at the edge (in the car), and to its end point (the manufacturer).</p>
<p><em>Smart cities and smart grids:</em> Like connected cars, utility systems are increasingly using real-time data to run systems more efficiently. Sometimes this data is generated in remote areas, so processing it close to where it is created is essential. Other times the data needs to be aggregated from a large number of sensors. Fog computing architectures could be devised to solve both of these issues.</p>
<p><em>Real-time analytics:</em> A host of use cases call for real-time analytics, from manufacturing systems that need to react to events as they happen to financial institutions that use real-time data to inform trading decisions or monitor for fraud. Fog computing deployments can help facilitate the transfer of data between where it is created and the variety of places where it needs to go.</p>
<h2><strong>Fog computing and 5G mobile computing </strong></h2>
<p>Some experts believe the expected roll out of 5G mobile connections in 2018 and beyond could create more opportunity for fog computing. “5G technology in some cases requires very dense antenna deployments,” explains Andrew Duggan, senior vice president of technology planning and network architecture at CenturyLink. In some circumstances antennas need to be less than 20 kilometers from one another. In a use case like this, a fog computing architecture could be created among these stations that includes a centralized controller that manages applications running on this 5G network, and handles connections to back-end data centers or clouds.</p>
<h2><strong>How does fog computing work? </strong></h2>
<p>A fog computing fabric can have a variety of components and functions. It could include fog computing gateways that accept data IoT devices have collected. It could include a variety of wired and wireless granular collection endpoints, including ruggedized routers and switching equipment. Other aspects could include customer premise equipment (CPE) and gateways to access edge nodes. Higher up the stack fog computing architectures would also touch core networks and routers and eventually global cloud services and servers.</p>
<p>The OpenFog Consortium, the group developing reference architectures, has outlined three goals for a fog framework: fog environments should be horizontally scalable, meaning they will support multiple industry vertical use cases; able to work across the cloud-to-things continuum; and a system-level technology that extends from things, over network edges, through to the cloud and across various network protocols.</p>
<h2><strong>Are fog computing and edge computing the same thing? </strong></h2>
<p>Helder Antunes, senior director of corporate strategic innovation at Cisco and a member of the OpenFog Consortium, says that edge computing is a component, or a subset of fog computing. Think of fog computing as the way data is processed from where it is created to where it will be stored. Edge computing refers just to data being processed close to where it is created. Fog computing encapsulates not just that edge processing, but also the network connections needed to bring that data from the edge to its end point.</p>
<p><strong>[ Related (NetworkWorld): <a href="https://www.networkworld.com/article/3224893/internet-of-things/what-is-edge-computing-and-how-it-s-changing-the-network.html" target="_blank" rel="noopener">What is edge computing and how it’s changing the network</a> ]</strong></p>
</section>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>The top 5 user requirements of IoT edge platforms</title>
		<link>https://bkmsoftware.com/top5-ur-iot</link>
		
		<dc:creator><![CDATA[Lorenz Baermann]]></dc:creator>
		<pubDate>Thu, 18 Jan 2018 15:51:13 +0000</pubDate>
				<category><![CDATA[Blog]]></category>
		<category><![CDATA[IoT]]></category>
		<guid isPermaLink="false">https://info.bkmsaas.net/gen/top5-ur-iot/</guid>

					<description><![CDATA[Based on actual users&#8217; experience with IoT platforms, here are the leading features and functionalities potential users should be looking for. Article published on NetworkWorld by Steven Hilton, Contributor, Jan 16, 2018 As an IoT platform and middleware analyst, I am asked constantly about the&#8230;]]></description>
										<content:encoded><![CDATA[<header class="cat">
<section class="deck viewability">
<h3>Based on actual users&#8217; experience with IoT platforms, here are the leading features and functionalities potential users should be looking for.</h3>
</section>
</header>
<section class="bodee">
<div id="drr-container" class="cat " data-kiosked-context-name="kskdUIContext_4fca20ca97d46817313e579fea1a5495">
<p><em>Article published on <a href="https://www.networkworld.com/article/3247801/internet-of-things/the-top-5-user-requirements-of-iot-edge-platforms.html" target="_blank" rel="noopener">NetworkWorld</a> by <span class="fn"><a href="https://www.networkworld.com/author/Steven-Hilton/" rel="author">Steven Hilton</a></span>, Contributor, <span class="pub-date">Jan 16, 2018</span></em></p>
<p>As an IoT platform and middleware analyst, I am asked constantly about the benefits of IoT platforms and “what makes a great IoT platform.” In response, I often ask these curious inquirers if they’ve ever used IoT platforms themselves. Walking on the edge is exhilarating, but having hands-on insights, data and expertise on how to survive the journey is even better.</p>
<p>What do users actually experience when they use IoT edge platforms?</p>
<p>IoT edge computing is a technology architecture that brings certain computational and analytics capabilities near the point of data generation. IoT edge platforms provide the management capabilities required to deliver data from IoT devices to applications while ensuring that devices are properly managed over their lifetimes. Enterprises use edge platforms for factory automation, warehousing/logistics, connected retail, connected mining and many other solutions. With IoT platform revenue slated to grow to <a href="https://www.machnation.com/2017/09/07/machnation-publishes-iot-platform-application-services-forecast-2017-2026/" rel="nofollow">USD63.4 billion</a> by 2026, IoT edge is one of the most highly relied upon enterprise IoT platform approaches.</p>
<p>Enterprises spend a tremendous amount of time completing edge-related IoT platform activities. According to hands-on tests of IoT platforms in <a href="https://www.machnation.com/mit-e-iot-test-environment/" rel="nofollow">MachNation’s IoT Test Environment (MIT-E)</a>, the majority of an enterprise user’s edge-related time is spent creating visualizations to gain insight from IoT data. 35% of a user’s time is spent creating dashboards with filtered alerts. And a combined 16% of a user’s time is spent viewing sensor data for an individual device (8%) or a group of devices (8%). Data from an IoT platform are critically important, so the ability to assemble dashboard sensor data and alerts is key – expect to spend a lot of time doing it.</p>
<p>Since the edge is critical for enterprises deploying IoT solutions, we’ve identified the top five user requirements of IoT edge platforms, based on IoT platform users’ experiences with these platforms.</p>
<h2>1. Pick a platform with extensive protocol support for data ingestion</h2>
<p>To seamlessly bring data from devices into the edge platform, enterprises should choose leading IoT platforms that support an extensive mix of protocols for data ingestion. The list of protocols for industrial-minded edge platforms generally includes brownfield deployment staples such as OPC-UA, BACNET and MODBUS as well as more current ones such as ZeroMQ, Zigbee, BLE and Thread. Equally important, the platform must be modular in its support for protocols, allowing customization of existing and development of new means of asset communications.</p>
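<p>That modularity requirement can be sketched as a simple adapter pattern: each protocol implements a common ingestion interface, so new protocols plug into the core without changing it. A minimal, purely illustrative Python sketch (the class names and hard-coded readings are invented; a real adapter would talk to an actual Modbus register or MQTT topic):</p>

```python
from abc import ABC, abstractmethod

class ProtocolAdapter(ABC):
    """Common interface every ingestion protocol must implement."""

    @abstractmethod
    def read(self, address: str) -> dict:
        """Return one normalized reading: {'source', 'value', 'unit'}."""

class ModbusAdapter(ProtocolAdapter):
    def read(self, address: str) -> dict:
        # A real adapter would poll a Modbus holding register here.
        return {"source": f"modbus://{address}", "value": 21.5, "unit": "C"}

class MqttAdapter(ProtocolAdapter):
    def read(self, address: str) -> dict:
        # A real adapter would subscribe to an MQTT topic here.
        return {"source": f"mqtt://{address}", "value": 0.93, "unit": "bar"}

class EdgeIngestor:
    """The core stays protocol-agnostic: adapters are registered by name."""

    def __init__(self):
        self._adapters = {}

    def register(self, name, adapter):
        self._adapters[name] = adapter

    def ingest(self, name, address):
        return self._adapters[name].read(address)

ingestor = EdgeIngestor()
ingestor.register("modbus", ModbusAdapter())
ingestor.register("mqtt", MqttAdapter())
reading = ingestor.ingest("modbus", "plc1/40001")
```

<p>Adding BACNET or Thread support then means writing one more adapter class and registering it, with no change to the ingestion core.</p>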
<h2>2. Ensure the platform has robust capability for offline functionality</h2>
<p>To ensure that the edge platform works when connectivity is down or limited, enterprises should choose leading IoT edge platforms that provide capabilities in four functional areas. First, edge systems need to offer data normalization to successfully clean noisy sensor data. Second, these systems must offer storage to support intermittent, unreliable or limited connectivity between the edge and the cloud. Third, an edge system needs a flexible event processing engine at the edge making it possible to generate insight from machine data when connectivity is constrained. Fourth, an IoT edge-enabled platform should integrate with systems including ERP, MES, inventory management and supply chain management to help ensure business continuity and access to real-time machine data.</p>
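<p>The storage requirement above is commonly met with a store-and-forward buffer: readings are persisted locally and drained to the cloud once the link returns. A minimal sketch using SQLite (illustrative only; real edge stacks ship their own buffering layers):</p>

```python
import json
import sqlite3

class StoreAndForwardBuffer:
    """Persist readings locally; drain them to the cloud when the link returns."""

    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS buffer (id INTEGER PRIMARY KEY, payload TEXT)"
        )

    def store(self, reading):
        self.db.execute("INSERT INTO buffer (payload) VALUES (?)", (json.dumps(reading),))
        self.db.commit()

    def drain(self, send):
        """Try to send every buffered reading; drop only the ones that succeed."""
        sent = 0
        rows = self.db.execute("SELECT id, payload FROM buffer").fetchall()
        for row_id, payload in rows:
            if send(json.loads(payload)):  # send() returns True on success
                self.db.execute("DELETE FROM buffer WHERE id = ?", (row_id,))
                sent += 1
        self.db.commit()
        return sent

buf = StoreAndForwardBuffer()
buf.store({"sensor": "temp1", "value": 21.5})
buf.store({"sensor": "temp2", "value": 19.0})
delivered = buf.drain(send=lambda reading: True)  # pretend the uplink is back
```

<p>Because failed sends stay in the buffer, intermittent connectivity between edge and cloud only delays delivery rather than losing data.</p>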
<h2>3. Make sure the platform provides cloud-based orchestration to support device lifecycle management</h2>
<p>To make sure that the edge platform offers highly secure device management, enterprises should select IoT platforms that offer cloud-based orchestration for provisioning, monitoring and updating of connected assets. Leading IoT platforms provide factory provisioning capabilities for IoT devices. These API-based interactions allow a device to be preloaded with certificates, keys, edge applications and an initial configuration before it is shipped to the customer. In addition, platforms should monitor the device using a stream of machine and operational data that can be selectively synced with cloud instances. Finally, an IoT platform should push updates over-the-air to edge applications, the platform itself, gateway OSs, device drivers and devices connected to a gateway.</p>
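<p>Factory provisioning, as described above, amounts to bundling identity material and an initial configuration with the device before it ships, in a form the device can verify. A toy sketch using an HMAC integrity tag (the field names and signing scheme are invented for illustration; real platforms use X.509 certificates and their own provisioning APIs):</p>

```python
import hashlib
import hmac
import json

FACTORY_KEY = b"factory-batch-secret"  # hypothetical per-batch signing key

def build_provisioning_payload(device_id, cert_pem, config):
    """Bundle identity material and initial config, tagged for integrity."""
    body = {"device_id": device_id, "certificate": cert_pem, "config": config}
    canonical = json.dumps(body, sort_keys=True).encode()
    body["tag"] = hmac.new(FACTORY_KEY, canonical, hashlib.sha256).hexdigest()
    return body

def verify_payload(payload):
    """Recompute the tag over everything except the tag itself."""
    body = {k: v for k, v in payload.items() if k != "tag"}
    canonical = json.dumps(body, sort_keys=True).encode()
    expected = hmac.new(FACTORY_KEY, canonical, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, payload["tag"])

payload = build_provisioning_payload(
    "gw-0042", "-----BEGIN CERTIFICATE-----...", {"report_interval_s": 60}
)
ok = verify_payload(payload)
```

<p>The point of the sketch is the shape of the workflow: the device can check at first boot that its preloaded identity and configuration arrived intact from the factory.</p>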
<h2>4. The platform needs a hardware-agnostic scalable architecture</h2>
<p>Since there are tens of thousands of device types in the world, enterprises should select IoT platforms that are capable of running on a wide range of gateways and specialized devices. And these platforms should employ the same software stack at the edge and in the cloud allowing a seamless allocation of resources. Platforms should support IoT hardware powered by chips that use ARM-, x86-, and MIPS-based architectures. Using containerization technologies and native cross-compilation, the platforms offer a hardware-agnostic approach that makes it possible to deploy the same set of functionalities across a varied set of IoT hardware without modifications.</p>
<h2>5. Comprehensive analytics and visualization tools make a big difference</h2>
<p>As we’ve already discussed, enterprises should choose IoT platforms that offer out-of-the-box capabilities to aggregate data, run common statistical analyses and visualize data. These platforms should make it easy to integrate leading analytics toolsets and use them to supplement or replace built-in functionality. Different IoT platform users will require different analyses and visualization capabilities. For example, a plant manager and a machine worker will want to access interactive dashboards that deliver useful information and relevant controls for each of their respective roles. Having flexibility in analytics and visualization capabilities will be essential for enterprises as they develop IoT solutions for their multiple business units and operations teams.</p>
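<p>A first cut at the &#8220;dashboards with filtered alerts&#8221; mentioned earlier is just a threshold rule over aggregated sensor data. A simplified, self-contained illustration (the device names, readings and threshold are all made up):</p>

```python
from statistics import mean

readings = [
    {"device": "pump-1", "temp_c": 71.0},
    {"device": "pump-1", "temp_c": 74.5},
    {"device": "pump-2", "temp_c": 58.0},
    {"device": "pump-2", "temp_c": 91.0},
]

ALERT_THRESHOLD_C = 85.0  # invented threshold for the example

def aggregate(rows):
    """Group readings per device and compute mean and peak temperature."""
    by_device = {}
    for row in rows:
        by_device.setdefault(row["device"], []).append(row["temp_c"])
    return {dev: {"mean": mean(vals), "max": max(vals)} for dev, vals in by_device.items()}

def filtered_alerts(stats, threshold):
    """Only devices whose peak exceeds the threshold raise an alert."""
    return [dev for dev, s in stats.items() if s["max"] > threshold]

stats = aggregate(readings)
alerts = filtered_alerts(stats, ALERT_THRESHOLD_C)
```

<p>Everything a platform adds beyond this – rolling windows, anomaly detection, role-based dashboards – is elaboration on the same aggregate-then-filter pattern.</p>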
<p>Enterprises worldwide are using IoT to increase security, improve productivity, provide higher levels of service and reduce maintenance costs. As they seek to adopt IoT solutions to improve their critical business processes, they should conduct hands-on usability tests to understand edge platform capabilities. Keep watching as more and more enterprises start walking on the edge.</p>
</div>
</section>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>IoT security needs a white knight</title>
		<link>https://bkmsoftware.com/iot-white-knight</link>
		
		<dc:creator><![CDATA[Lorenz Baermann]]></dc:creator>
		<pubDate>Thu, 18 Jan 2018 15:45:33 +0000</pubDate>
				<category><![CDATA[Blog]]></category>
		<category><![CDATA[IoT]]></category>
		<category><![CDATA[security]]></category>
		<category><![CDATA[securityIoT]]></category>
		<guid isPermaLink="false">https://info.bkmsaas.net/gen/iot-white-knight/</guid>

					<description><![CDATA[It may be time for the U.S. government to step in to coordinate security standards across all the players that participate in creating the internet of things, Frost &#38; Sullivan says Article published on NetworkWorld by Jon Gold, Senior Writer, Jan 15, 2018 Thanks to&#8230;]]></description>
										<content:encoded><![CDATA[<section class="deck viewability">
<h3>It may be time for the U.S. government to step in to coordinate security standards across all the players that participate in creating the internet of things, Frost &amp; Sullivan says</h3>
</section>
<p><em>Article published on <a href="https://www.networkworld.com/article/3247774/internet-of-things/iot-security-needs-a-white-knight.html" target="_blank" rel="noopener">NetworkWorld</a> by <span class="fn"><a href="https://www.networkworld.com/author/Jon-Gold/" rel="author">Jon Gold</a></span>, Senior Writer, <span class="pub-date">Jan 15, 2018</span></em></p>
<p>Thanks to the <a href="https://www.networkworld.com/article/3136314/security/the-secret-behind-the-success-of-mirai-iot-botnets.html" target="_blank" rel="noopener">Mirai botnet</a> attacks, few people in the world of tech need a reminder that IoT devices remain a serious threat to enterprise networks. Still, more than a year after the botnet made headlines worldwide, IoT security remains mostly an idea, rather than a reality.</p>
<p>Such is the scope of the problem that Frost and Sullivan IoT research director Dilip Sarangan argues for governmental intervention. Sarangan says that, because the responsibility for IoT security is diffused across device manufacturers, network providers, software developers and many others, it’s difficult for the industry to make progress on all-encompassing standards.</p>
<p>“The only entity that has the ability to actually dictate what the minimum threshold is, unfortunately, is the U.S. government,” he said.</p>
<p>The difficulty in creating overarching standards mostly has to do with the fact that any given IoT implementation has a large number of moving parts, each of which may be administered by different organizations, or even by third parties. For example, a set of medical devices provided by company A may connect to a network provided by company B, running an application originally written by company C and residing in company D’s cloud.</p>
<p>“Everyone talks about it like they’re going to provide end-to-end security, and there’s actually no way to do that,” said Sarangan. “You have no control over a lot of parts of an IoT solution.”</p>
<h3><strong>Network visibility</strong></h3>
<p>From the networking side, Sarangan said, there are plusses and minuses to most of the options available to any given IoT implementation. Cellular networks, for example, tend to be a lot more secure than Wi-Fi, ZigBee or the other wide-area options, but a company will probably have much more limited visibility into what’s happening on that network.</p>
<p>That, in and of itself, can be a security issue, and it’s imperative for the carriers to provide more robust device management features in the future.</p>
<p>“What type of device it is, what type of information it’s supposed to send, where it’s supposed to send the data, what you are supposed to do with that data – until you know all of that, it’s hard to be completely secure,” said Sarangan.</p>
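<p>Sarangan&#8217;s checklist (what type of device it is, what data it sends, where it sends it) can be encoded as a per-device behavior profile that traffic is checked against. A toy allowlist sketch (the profile fields and device names are invented for illustration):</p>

```python
# Expected behavior per device, expressed as an allowlist (all values invented).
PROFILES = {
    "thermostat-17": {
        "kind": "thermostat",
        "fields": {"temp_c", "humidity"},
        "destinations": {"telemetry.example.com"},
    },
}

def is_expected(device_id, fields, destination):
    """Allow a message only if device, payload fields and destination all match."""
    profile = PROFILES.get(device_id)
    if profile is None:
        return False  # unknown device: block by default
    return fields <= profile["fields"] and destination in profile["destinations"]

ok = is_expected("thermostat-17", {"temp_c"}, "telemetry.example.com")
bad = is_expected("thermostat-17", {"temp_c", "mic_audio"}, "telemetry.example.com")
```

<p>A thermostat that suddenly ships audio, or reports to an unknown host, fails the profile check – which is exactly the kind of visibility carriers would need to provide on cellular networks.</p>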
<p>Improved network visibility is key to preventing worst-case scenarios like malicious actors accessing power grids and Internet infrastructure, but so are common-sense measures like air gaps.</p>
<p>“You have the hacks happening, but the hacks haven’t been significant enough to where you’d worry about it,” he said. “The other side of it is that a lot of critical infrastructure – let’s say a smart grid – is on private networks.”</p>
<h3><strong>A sea of IoT devices</strong></h3>
<p>A lack of quality control and the presence of a host of very old devices on IoT networks might be the most critical security threats, however. Decades-old hardware, which may not have been designed to be connected to the Internet in the first place, let alone stand up to modern-day security threats, creates a serious issue.</p>
<p>“You have over 10 billion IoT devices out there already … and a lot of these devices were created in 1992,” noted Sarangan.</p>
<p>Moreover, the huge number of companies making IoT-enabled hardware makes for a potentially serious problem where quality control is concerned. Big companies like Amazon and Microsoft and Google make headlines for their smart home gizmos, but the world of IoT is a lot broader than that.</p>
<p>China, in particular, is a major source of lower-end IoT devices – speakers, trackers, refrigerators, bike locks and so on – and it’s not just the Huaweis and Xiaomis of the world providing the hardware.</p>
<p>“[There are] hundreds of mom-and-pop shops out there developing hardware that we don’t necessarily know whether to trust or not – these are devices that are getting on unsecured Wi-Fi networks,” said Sarangan. “That’s already a security threat, and a large portion of Americans don’t actually protect their routers.”</p>
<p>Indeed, hidden backdoors have already been found on some such devices, <a href="https://www.theregister.co.uk/2017/03/02/chinese_iot_kit_backdoor_claims/" rel="nofollow">according to The Register</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Basic Rules for Securing IoT Stuff</title>
		<link>https://bkmsoftware.com/basic-rules-for-securing-iot-stuff</link>
		
		<dc:creator><![CDATA[Lorenz Baermann]]></dc:creator>
		<pubDate>Thu, 18 Jan 2018 15:36:35 +0000</pubDate>
				<category><![CDATA[Blog]]></category>
		<category><![CDATA[IoT]]></category>
		<category><![CDATA[security]]></category>
		<category><![CDATA[securityIoT]]></category>
		<guid isPermaLink="false">https://info.bkmsaas.net/gen/basic-rules-for-securing-iot-stuff/</guid>

					<description><![CDATA[Article written by Brian Krebs, published on KrebsOnSecurity the 18th Jan. 2018 Most readers here have likely heard or read various prognostications about the impending doom from the proliferation of poorly-secured “Internet of Things” or IoT devices. Loosely defined as any gadget or gizmo that connects&#8230;]]></description>
										<content:encoded><![CDATA[<p>Article written by <a href="https://krebsonsecurity.com/about/" target="_blank" rel="noopener">Brian Krebs</a>, published on <a href="https://krebsonsecurity.com/2018/01/some-basic-rules-for-securing-your-iot-stuff/" target="_blank" rel="noopener">KrebsOnSecurity</a> the 18th Jan. 2018</p>
<p>Most readers here have likely heard or read various prognostications about the impending doom from the proliferation of poorly-secured “<strong>I</strong>nternet <strong>o</strong>f <strong>T</strong>hings” or <em>IoT</em> devices. Loosely defined as any gadget or gizmo that connects to the Internet but which most consumers probably wouldn’t begin to know how to secure, IoT encompasses everything from security cameras, routers and digital video recorders to printers, wearable devices and “smart” lightbulbs.</p>
<p>Throughout 2016 and 2017, <a href="https://krebsonsecurity.com/?s=mirai+attack&amp;x=0&amp;y=0" target="_blank" rel="noopener">attacks from massive botnets made up entirely of hacked IoT devices</a> had many experts warning of a dire outlook for Internet security. But the future of IoT doesn’t have to be so bleak. Here’s a primer on minimizing the chances that your IoT things become a security liability for you or for the Internet at large.</p>
<p><strong>-Rule #1: Avoid connecting your devices directly to the Internet</strong> — either without a firewall or in front of it, by poking holes in your firewall so you can access them remotely. Putting your devices in front of your firewall is generally a bad idea because many IoT products were simply not designed with security in mind and making these things accessible over the public Internet could invite attackers into your network. If you have a router, chances are it also comes with a built-in firewall. Keep your IoT devices behind the firewall as best you can.</p>
<p><strong>-Rule #2:</strong> <strong>If you can, change the thing’s default credentials</strong> to a complex password that only you will know and can remember. And if you do happen to forget the password, it’s not the end of the world: Most devices have a recessed reset switch that can be used to restore the thing to its factory-default settings (and credentials). <a href="http://krebsonsecurity.com/password-dos-and-donts/" target="_blank" rel="noopener">Here’s some advice</a> on picking better ones.</p>
<p>I say “if you can,” at the beginning of Rule #2 because very often IoT devices — particularly security cameras and DVRs — are so poorly designed from a security perspective that even changing the default password to the thing’s built-in Web interface does nothing to prevent the things from being reachable and vulnerable once connected to the Internet.</p>
<p>Also, many of these devices are found to have hidden, undocumented “backdoor” accounts that attackers can use to remotely control the devices. That’s why Rule #1 is so important.<span id="more-41905"></span></p>
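<p>For Rule #2, Python&#8217;s standard <code>secrets</code> module is one easy way to generate a replacement credential strong enough to retire a factory default (a sketch, not a substitute for a password manager):</p>

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Random password drawn from letters, digits and punctuation via a CSPRNG."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

password = generate_password()
```

<p>Unlike <code>random</code>, <code>secrets</code> draws from the operating system&#8217;s cryptographically secure randomness source, which is what you want for credentials.</p>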
<p><strong>-Rule #3: Update the firmware. </strong>Hardware vendors sometimes make available security updates for the software that powers their consumer devices (known as “firmware”). It’s a good idea to visit the vendor’s Web site and check for any firmware updates before putting your IoT things to use, and to check back periodically for any new updates.</p>
<p><strong>-Rule #4: Check the defaults</strong>, and make sure <a href="https://krebsonsecurity.com/2016/10/who-makes-the-iot-things-under-attack/comment-page-3/#comment-413306" target="_blank" rel="noopener">features you may not want or need</a> like UPnP (<a href="https://en.wikipedia.org/wiki/Universal_Plug_and_Play" target="_blank" rel="noopener">Universal Plug and Play</a> — which can easily poke holes in your firewall without you knowing it) — are disabled.</p>
<p>Want to know if something has poked a hole in your router’s firewall? <strong>Censys</strong> has a decent scanner that may give you clues about any cracks in your firewall. Browse to <strong>whatismyipaddress.com</strong>, then cut and paste the resulting address into the text box at <a href="https://censys.io/" target="_blank" rel="noopener">Censys.io</a>, select “IPv4 hosts” from the drop-down menu, and hit “search.”</p>
<p>If that sounds too complicated (or if your ISP’s addresses are on Censys’s blacklist) check out <strong>Steve Gibson</strong>‘s <a href="https://www.grc.com/x/ne.dll?bh0bkyd2" target="_blank" rel="noopener">Shield’s Up page</a>, which features a point-and-click tool that can give you information about which network doorways or “ports” may be open or exposed on your network. A quick Internet search on exposed port number(s) can often yield useful results indicating which of your devices may have poked a hole.</p>
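<p>If you would rather probe a specific host yourself, a plain TCP connect test does the same job as those web tools for individual ports. A small sketch (only scan hosts you own, such as your router&#8217;s LAN address):</p>

```python
import socket

def is_port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (against your own router): is_port_open("192.168.1.1", 23)
# would tell you whether Telnet, a favorite IoT attack vector, is reachable.
```

<p>A <code>True</code> on a port you never opened deliberately is the cue to go hunting for the device (or the UPnP rule) responsible.</p>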
<p>If you run antivirus software on your computer, consider upgrading to a “network security” or “Internet security” version of these products, which ship with more full-featured software firewalls that can make it easier to block traffic going into and out of specific ports.</p>
<p>Alternatively, <a href="https://www.glasswire.com/features/" target="_blank" rel="noopener">Glasswire</a> is a useful tool that offers a full-featured firewall as well as the ability to tell which of your applications and devices are using the most bandwidth on your network. Glasswire recently came in handy to help me determine which application was using gigabytes worth of bandwidth each day (it turned out to be a version of Amazon Music’s software client that had a glitchy updater).</p>
<p><strong>-Rule #5: Avoid IoT devices that advertise Peer-to-Peer (P2P) capabilities</strong> built-in. P2P IoT devices are notoriously difficult to secure, and research has repeatedly shown that they can be reachable even through a firewall remotely over the Internet because they’re configured to continuously find ways to connect to a global, shared network so that people can access them remotely. For examples of this, see previous stories here, including <a href="https://krebsonsecurity.com/2016/02/this-is-why-people-fear-the-internet-of-things/" target="_blank" rel="noopener">This is Why People Fear the Internet of Things</a>, and <a href="https://krebsonsecurity.com/2016/12/researchers-find-fresh-fodder-for-iot-attack-cannons/" target="_blank" rel="noopener">Researchers Find Fresh Fodder for IoT Attack Cannons</a>.</p>
<p><strong>-Rule #6: Consider the cost.</strong> Bear in mind that when it comes to IoT devices, cheaper usually is not better. There is no direct correlation between price and security, but history has shown that devices toward the lower end of the price range for their class tend to have the most vulnerabilities and backdoors, with the least amount of vendor upkeep or support.</p>
<p>In the wake of <a href="https://krebsonsecurity.com/2017/12/mirai-iot-botnet-co-authors-plead-guilty/" target="_blank" rel="noopener">last month’s guilty pleas by several individuals who created Mirai</a> — one of the biggest IoT malware threats ever — the <strong>U.S. Justice Department</strong> released <a href="https://www.justice.gov/criminal-ccips/page/file/984001/download" target="_blank" rel="noopener">a series of tips on securing IoT devices</a>.</p>
<p>One final note by the author (Krebs): I realize that the people who probably need to be reading these tips the most likely won’t ever know they need to care enough to act on them. But at least by taking proactive steps, you can reduce the likelihood that your IoT things will contribute to the global IoT security problem.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>6 things that prevent Blockchain from ruling the world</title>
		<link>https://bkmsoftware.com/6-things-that-prevent-blockchain-from-ruling-the-world</link>
		
		<dc:creator><![CDATA[Lorenz Baermann]]></dc:creator>
		<pubDate>Thu, 18 Jan 2018 14:48:14 +0000</pubDate>
				<category><![CDATA[Blog]]></category>
		<category><![CDATA[Blockchain]]></category>
		<guid isPermaLink="false">https://info.bkmsaas.net/gen/6-things-that-prevent-blockchain-from-ruling-the-world/</guid>

					<description><![CDATA[By Jonas DeMuro  &#8211; 18th Jan. 2018 &#8211; World of tech   The blockchain is the technology behind Bitcoin (and other cryptocurrencies) which is currently dominating the headlines, due to its meteoric rise over the past month, and the equally massive plunge it has taken&#8230;]]></description>
										<content:encoded><![CDATA[<div class="byline-social">
<p class="byline">By <span class="no-wrap by-author"><a href="http://www.techradar.com/author/jonas-demuro" target="_blank" rel="author noopener">Jonas DeMuro</a></span>  &#8211; 18th Jan. 2018 &#8211;<span class="no-wrap"> <a class="chunk category" href="http://www.techradar.com/news/world-of-tech" target="_blank" rel="noopener"> World of tech </a> </span></p>
</div>
<p>The blockchain is the technology behind Bitcoin (and other cryptocurrencies), which is currently dominating the headlines due to its meteoric rise over the past month and the equally massive plunge it has taken this week. Bitcoin is nothing but volatile.</p>
<p>Blockchain tech, on the other hand, is a transparent, distributed digital ledger, that is inherently secure. It has the promise to revolutionize many diverse sectors, including musical digital rights management, secure digital voting, storage of healthcare records, and digital ‘smart’ legal contracts – to name but a few applications. The blockchain is frequently referred to as a disruptive invention, even compared to the very invention of the internet itself.</p>
<p>While blockchain technology offers many advantages, including a high level of security against fraud, and potentially cost-effective transactions, it may not become a storming success and sweep the world off its feet as soon as you might think. As with most fresh technological innovations, it faces an uphill battle towards adoption.</p>
<p>Here are some of the current obstacles that are ‘blocking the blockchain’, as it were.</p>
<h3>1. Energy wastage</h3>
<p>Bitcoin and cryptocurrency mining are highly dependent on GPUs and ASIC miners for profitability. Anyone who has built a computer is aware that GPUs require a robust power supply to function, with a greater amount of power on tap being ideal for stability.</p>
<p>Also note that the security of the Bitcoin blockchain is obviously critical, and must ensure that any effort to defraud the system isn’t worthwhile – that effort would be better directed at simply mining the next Bitcoin, as this would be more profitable.</p>
<p>Now, as of December 6, 2017, the energy consumption of Bitcoin mining reached 32.36 Terawatt-hours per year, which is a ridiculous amount of power, and is actually higher than the energy usage of 159 individual countries according to one estimate.</p>
<p>With all this in mind, maintaining data in a blockchain – and keeping it intact and free of fraud – is an inherently energy-inefficient process. In the current era of 6W processors for laptops, deep sleep states for electronics, and solar panels, all aimed at greater energy efficiency and independence, the high energy consumption of blockchain technology and virtual currency mining flies in the face of this.</p>
<h3>2. Data woes</h3>
<p>Generally speaking, the internet is fairly efficient when it comes to the transmission of data. The user requests information, and the server transmits back the piece of data requested with only a small amount of additional data required to get it there.</p>
<p>However, the blockchain, in order for it to be preserved, as well as to prevent hacking, needs multiple copies distributed across many nodes. And the blockchain then requires a large amount of storage – for example, Bitcoin’s blockchain was nearly 150GB in size as of last month, and it’s getting bigger all the time.</p>
<p>Furthermore, transmitting so much data for the blockchain each time also consumes additional electricity, making the blockchain quite inefficient. In a time where efforts are being made to compress video further to decrease the data required for a download, blockchain’s bulkiness makes little sense.</p>
<h3>3. Time for adoption</h3>
<p>While blockchain technology may ultimately work for some sectors, its wider adoption may be a sluggish process, particularly when it comes to industries which are notably set in their ways.</p>
<p>Some sectors – like legal and healthcare – have only just started to move away from paper records, and in some cases still maintain them as backups. They are unlikely to jump to a cutting-edge solution such as the blockchain overnight.</p>
<p>The technology will need to clearly demonstrate advantages and gain a proven track record before this happens, and that could potentially take decades. After all, remember that stock markets held onto their old ticker tapes in the 1970s, after using them from 1867, and the last telegram in the world was sent in 2013.</p>
<h3>4. Centralized may be a good thing</h3>
<p>Bitcoin was developed to be a decentralized cryptocurrency that allows for peer-to-peer transactions. However, this can be a disadvantage, such as when governments cannot track funds easily, and risk losing on the tax side of the equation (which may, potentially, mean that the average taxpayer ends up paying more). It also makes things more challenging when users experience fraud, and recovering funds can be difficult.</p>
<h3>5. Slow transactions with cryptocurrency</h3>
<p>Some tout Bitcoin as the future of currency, and the promise is that peer-to-peer transactions can happen in a fast and cost-efficient manner that can compete with traditional credit cards.</p>
<p>However, Bitcoin transactions are painfully slow, with transactions occurring at the glacial pace (at least in the world of finance) of multiple hours for each transaction in some cases. One of the current reasons for this bottleneck is that each transaction typically has to be confirmed six times – once per newly mined block – before it is considered final.</p>
<p>Obviously enough, this process needs to be sped up significantly for Bitcoin to realistically become a true rival to established methods of buying goods.</p>
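<p>To put that bottleneck in numbers: Bitcoin targets one new block roughly every 10 minutes, and a payment is commonly treated as final after six confirmations, so the expected wait is on the order of an hour even when everything goes smoothly (a back-of-the-envelope sketch):</p>

```python
BLOCK_INTERVAL_MIN = 10  # Bitcoin's target time between mined blocks
CONFIRMATIONS = 6        # common threshold for treating a payment as final

expected_wait_min = BLOCK_INTERVAL_MIN * CONFIRMATIONS
expected_wait_h = expected_wait_min / 60
```

<p>Sixty minutes in the best case – and network congestion can stretch it much further, while a card network authorizes a purchase in seconds.</p>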
<h3>6. Private problems</h3>
<p>Many of the advantages of the blockchain come from its public use – anyone can download the entire blockchain, and mine for additional currency, which democratizes this process.</p>
<p>It also keeps it immune from hackers – with such a large legitimate group dedicated to mining, any fraud attempts would effectively have to ‘out-mine’ the miners, a process that would take a colossal amount of computing power for a popular cryptocurrency. This type of blockchain is known as a public blockchain.</p>
<p>So what about a private blockchain? Well, the same blockchain tech can be applied as a storage medium, and if a company doesn’t want anyone to download the entire blockchain – and no one is going to mine it – then this is kept as a private blockchain. It is also held in a handful of private nodes, rather than distributed across thousands of public nodes as is the case for a public blockchain.</p>
<p>With a private blockchain, while it is more carefully controlled, and far less likely to be hijacked or hacked, it also flies in the face of the whole fundamental idea of this technology – losing the advantages of transparency and wider distribution that make the blockchain tech intriguing in the first place.</p>
<hr />
<p>Article published on <a href="http://www.techradar.com/news/6-things-that-prevent-blockchain-from-ruling-the-world" target="_blank" rel="noopener">TechRadar</a></p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>IT Policies</title>
		<link>https://bkmsoftware.com/it-policies</link>
		
		<dc:creator><![CDATA[Lorenz Baermann]]></dc:creator>
		<pubDate>Thu, 13 Jul 2017 13:39:23 +0000</pubDate>
				<category><![CDATA[Blog]]></category>
		<category><![CDATA[consultoría]]></category>
		<category><![CDATA[ITP]]></category>
		<category><![CDATA[Policies]]></category>
		<category><![CDATA[PoliciesConsulting]]></category>
		<guid isPermaLink="false">https://info.bkmsaas.net/gen/it-policies/</guid>

					<description><![CDATA[The importance of ITP Targets and benefits of ITP Today it&#8217;s almost impossible to find organizations of any size not using information technology (IT); moreover, the smaller the company the more it relies on automation or IT services. It is both a powerfull tool for&#8230;]]></description>
										<content:encoded><![CDATA[<h2>The importance of ITP</h2>
<h3>Targets and benefits of ITP</h3>
<p>Today it&#8217;s almost impossible to find organizations of any size not using information technology (IT); moreover, the smaller the company, the more it relies on automation or IT services. IT is both a powerful tool for business and, if not properly managed, a certain source of risks and hidden costs.</p>
<p>IT specialists inside organizations are expensive and should not be on the payroll unless IT is the company&#8217;s core business; even then, outsourcing is a common practice. Harmonizing services and assets is impossible when relying on outsourced IT without first defining criteria and policies for the matter. These policies must address the company&#8217;s business rules and specific operational needs. Nonetheless, it&#8217;s a common <em>bad practice</em> in small organizations to <em>bend company operations</em> and processes to the adopted software or services instead of the other way around. Fortunately, the IT market is full of possibilities, and a careful selection of tools is the best recommended practice.</p>
<h3>The Business IT Alignment</h3>
<p>All companies and situations are different, but every company that uses computers, email, the internet, and software on a daily basis should have IT policies in place. These policies apply to both employees and IT subcontractors and provide a vision and strategy for such an important asset as technology. Without an internal IT policy, the company runs the risk of implicitly adopting vendor or third-party policies. An important secondary benefit of owning an IT policy is the control and independence it grants from vendor strategies and vendor lock-in practices, which must be avoided.</p>
<p>Solving these important risks, saving costs, and granting business-centric priority and independence is what we call &#8220;<em>Business IT Alignment</em>&#8221;.</p>
<h3>Target and control</h3>
<p>Employees need to know what is expected and required of them when using the technology provided by their employer, and it is critical for a company to protect itself by having policies that govern areas such as personal internet and email usage, security, software and hardware inventory, and data retention. It is also important for the business owner to know the potential time and productivity lost at their business because of personal internet usage. Contractors, on the other hand, must operate within the framework of the customer&#8217;s IT policies to guarantee security, compliance, operations, and quality.</p>
<h3>With Baermann</h3>
<p>Although it may seem a difficult task, it is not when relying on Baermann&#8217;s independent approach to the problem, and it is fundamental to obtaining the much-needed reliability of systems working properly on a daily basis. Baermann helps customers define these policies, minimizing or eliminating unnecessary risks and hidden costs while giving business needs the right priority.</p>
<h2>Guidelines</h2>
<p>Without written policies, there are no standards to reference when sticky or <em>status quo</em> situations arise, such as those highlighted above. So, what exactly do IT policies cover? There are six areas that need to be addressed; for each area, specific guidelines will be documented:</p>
<ol>
<li><strong>AUT &#8211; Acceptable Use of Technology</strong>: guidelines for the use of computers, telephones and mobile devices, internet, company and external email services, voicemail and the consequences for misuse.</li>
<li><strong>Security</strong>: guidelines for passwords, levels of access to the network, internet navigation policies, anti-virus policies, data confidentiality, and storage and usage of data.</li>
<li><strong>Disaster Recovery</strong>: guidelines for data recovery in the event of a disaster, and data backup methods.</li>
<li><strong>Technology Standards</strong>: guidelines to determine the types of software, hardware, and systems that will be purchased, supported, and used at the company, including those that are prohibited.</li>
<li><strong>Network Setup and Documentation</strong>: guidelines regarding how the network is configured, how to add new employees to the network, permission levels for employees, and licensing of software.</li>
<li><strong>IT Services</strong>: guidelines to determine how technology needs and problems will be addressed, who in the organization is responsible for technical support, maintenance, installation, and long-term technology planning.</li>
</ol>
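<p>The six areas above lend themselves to a simple machine-readable checklist for auditing which guidelines a company has actually documented. A toy sketch (area names abbreviated; the scoring is purely illustrative):</p>

```python
POLICY_AREAS = {
    "acceptable use",
    "security",
    "disaster recovery",
    "technology standards",
    "network setup and documentation",
    "it services",
}

def coverage(documented):
    """Fraction of the six policy areas with a written guideline."""
    return len(documented & POLICY_AREAS) / len(POLICY_AREAS)

def gaps(documented):
    """Areas still missing a written guideline, in a stable order."""
    return sorted(POLICY_AREAS - documented)

have = {"security", "disaster recovery"}
score = coverage(have)
missing = gaps(have)
```

<p>Reviewing such a checklist per department or per contractor makes the gaps visible long before a sticky situation forces the issue.</p>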
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
