Thursday 31 October 2013

The e-book market takes shape

Read at: http://www.usine-digitale.fr/article/le-marche-du-livre-electronique-se-structure.N210261#xtor=EPR-4

The e-book market takes shape © Amazon
The rise of the e-book is undeniable. Publishers and retail chains alike are scrambling for a share of a market largely dominated by Amazon, though the arrival of new entrants could change the game. In Europe, the market is taking shape and is on track to close its gap with the United States.
The renowned Frankfurt Book Fair, which brought together more than 200,000 book-industry professionals from 9 to 13 October, offered a chance to take stock of the European market and of the major trends that have been taking shape since the arrival of e-books and of competition from sites such as Amazon. According to a 2012 Gartner study, e-books should generate up to 16 billion dollars worldwide in 2016. In France, retail giant Carrefour has just announced that it is joining the battle with its own e-reader, paired with a new online bookstore, a way of catching up in a cultural industry that is going digital at full speed.
The United States has embraced the e-book; Europe lags behind
The US e-book market should exceed 5 billion dollars in 2016, according to analyst Michael Wolf. A recent report by researcher Rüdiger Wischenbart finds that e-books now account for 20% of the US book market and have gradually become commonplace. Amazon is one of the main beneficiaries, following the July 2013 ruling against Apple, which had been accused of anticompetitive practices. More than half of the books sold in the United States should be bought online next year, according to Russ Grandinetti, Amazon's vice president of Kindle Content. Britain is close behind the United States, and the rest of Europe will soon follow, he says. The pace of the digital transition on the old continent remains very uneven, however.
While Britain and the Scandinavian countries are doing well, the Wischenbart report shows that France and Italy are lagging, whereas Spain and Germany already seem to have turned a corner.
In France, despite a book market that is very strong compared with other cultural products, e-books account for only 2.1% of publishers' revenue. Yet the country counts 25.3 million reading devices, including 500,000 e-readers and 4.5 million tablets. The e-book is still seen as too expensive. The reduced 5.5% VAT rate applied to e-books since January 2013, aligned with that of printed books, was supposed to let consumption take off. According to several analyses it has not, as publishers on the whole have not passed the VAT cut on to prices. One to watch.
The episode does show, however, that the major difference between the American and European markets lies in their tax systems.
New models that could change the game
Ultimately, the crux of the battle is a winning pricing strategy for the publishers facing the Amazon juggernaut. On the whole they still enjoy a monopoly on the rights to their titles. But authors could soon choose to negotiate directly with Amazon, and publishers are not immune to an abrupt change in the rules of the game. In France, the rule barring online booksellers from combining the legal discount on book prices with free shipping was a thinly veiled attempt to rein in Amazon.
In the United States, Amazon, cultural-retail giant Barnes & Noble and Apple are waging all-out war. Analysts put Apple's share of the e-book market at around 10%, against 50 to 60% for Amazon and 25% for Barnes & Noble and its Nook tablet. Apple nonetheless claimed a 20% share in June. Two other competitors are honing their strategies: Google, with its Nexus tablet and its Google Play store, and a fast-rising platform, especially outside the United States: Kobo. The Canadian e-reader maker, bought by the Japanese group Rakuten in 2012, will keep making waves in Europe. "Kobo's best asset? It isn't Amazon," the British newspaper The Observer once wrote. Publishers are on the lookout for any platform that can break Amazon's stranglehold, and plenty of start-ups have taken note.
Indeed, the Netflix-style subscription model is booming. Start-ups such as Scribd, Oyster and eReatah present themselves as the "Netflix or Spotify" of reading. In France, the start-up Youboox, built on the same model, raised 1.1 million euros in September. All of these platforms offer a subscription to an all-you-can-read e-book service. Scribd's co-founder sums up the stakes: "Netflix is worth about 18 billion dollars. Spotify about 3 billion. I don't see why there wouldn't be a similar opportunity in [books]."
It is not certain the model can work, however. Figures from the first six months of Scribd's trial show that only 2% of users read more than 10 books a month. In the end, all of these services compete with every other form of entertainment in a merciless battle for consumers' attention. As Markus Dohle, head of publishing giant Penguin Random House, puts it: "We want consumers in the future to choose books, not Netflix."
Customers under control
The war between tablets and e-readers has been declared. The market for devices dedicated to reading exploded at first, but it is now slowing visibly. That has not stopped Carrefour in France from launching its own model, the Nolimbook, to be stocked in 230 of its stores, a way of challenging Amazon's dominance. But the new e-reader makers are not fighting the Kindle alone: all of them should be worried by the success of the iPad and the other tablets, which offer web services on top of reading and whose prices keep falling.
According to a 2012 report from the research firm IDC, e-readers are in fact on their way out: US e-reader orders fell 28% in 2012. The Pew Research Center finds that people have not stopped reading; they simply want multimedia devices more sophisticated than e-readers, ones that also let them browse the web, take photos or play games. According to Tom Mainelli, IDC's research director for tablets, "e-readers will become a niche product".
That is not yet the case in Europe, insists Michaël Dahan, co-founder of Bookeen, Carrefour's partner on the Nolimbook. The European e-reader market is said to be growing strongly, "with volumes up 50% to 60% a year", he says. How long that lasts remains to be seen.
The stakes are high, because making the device means holding real power over the consumer. According to Jürgen Boos, director of the Frankfurt Book Fair, Amazon, Apple and the like "are customer-loyalty machines that dominate not only online retail but also the reading devices. They thus have customers under their control." The web giants counter that they bring consumers innovation and choice...
Nora Poggi

To bounce back, SNCF aims to become "the most digital of transport operators"

Read at: http://www.usine-digitale.fr/article/pour-rebondir-la-sncf-a-l-ambition-de-devenir-le-plus-digital-des-transporteurs.N210513#xtor=EPR-4

To bounce back, SNCF aims to become "the most digital of transport operators" © DR
At its annual press seminar, SNCF's management unveiled its objectives for the next seven years under the Excellence 2020 plan. Digital, the Ile-de-France region and international business are on the historic rail operator's menu.
"The horizon is clearer. The rail reform has been adopted by the Council of Ministers. The all-TGV era is well and truly over. Priority goes to the existing network and to the Ile-de-France. Only one thing remains unclear: competition. We are waiting for the fourth package from Brussels." That is how SNCF president Guillaume Pepy summed up the state of French rail at the annual press seminar. SNCF has set three key priorities in its development plan for the coming years: "The first priority is the Ile-de-France, the second is to raise international business from 23 to 30%, and the third is the development of digital."
Door to door
The corporate plan is built around a single ambition: excellence. It hinges on accelerating digital. "We are doing rather well. But we must be the most digital of transport operators," insists Guillaume Pepy. Above all, SNCF is rewriting its mental model of the traveller. "It will no longer be a station-to-station service, but door to door," was heard time and again at the Lyon seminar, as members of the executive committee took the floor. SNCF is no longer a rail operator but a multimodal operator that takes care of the traveller from one end of the journey to the other, offering a personalised, permanently connected trip. The roll-out of NFC tickets embedded in smartphones will begin in January 2014 in 7 regions and is due to be completed in 2015.
The priority given to "everyday transport" also runs through digital, with a high-speed broadband plan for dense urban areas, but the key element is the Paris-region network. Devoting a large share of resources to the Ile-de-France will not be the easiest task, however. "90% of elected officials think everything is done for the Ile-de-France, when for years it has been exactly the opposite. In ten years we can get there. The reform will play out first in the Ile-de-France. We will not wait for 2015," warns the SNCF president.

SNCF makes a gesture on VAT
VAT on public transport is set to rise from 7 to 10%, after the previous government had already raised it by 1.5 points. This is bad news for passengers and local authorities, and an increase that could push some daily public-transport users back to the cars they had left in the garage. The 3-point rise will be passed on to tickets and season passes, including by the incumbent operator. SNCF is nonetheless making a gesture: Barbara Dalibard, chief executive of SNCF Voyages, announced that "SNCF is forgoing price increases in 2014".
Debt must come down to 5 billion euros by 2020 (from more than 7 billion today), which requires growth of 3% a year. Given the situation in France, international expansion is inevitable, notably through the Keolis subsidiary, which hopes to win at least one major tender in the United Kingdom in the coming months.
"All our investments will be financed without a single additional euro of debt," warns Guillaume Pepy. One avenue is to produce more cheaply in order to sell more: budget fares are to become more numerous, with their share set to double. Producing more cheaply also means making better use of rolling stock and fighting theft. On that front, a decision will be taken before the end of 2013 on buying drones to monitor the lines, a purchase that could be made alone or together with other companies such as EDF.
But SNCF must also cut costs, starting with a 700-million-euro reduction in overheads. Mathias Emmerich, deputy chief executive for finance, gives the breakdown: "300 million on purchasing, which currently stands at 11 billion euros; 200 million on head-office, facility and regional expenses; 150 million euros on IT services; and 50 million euros on property." SNCF will also have to find 1.3 billion euros of savings across every activity: sales, station services, ground handling, regional TER services... and no doubt headcount. If all the targets are met, SNCF will be ready in 2020 to face competition without flinching.
Olivier Cognasse

SNCF freight climbs out of the deep red
Fret SNCF is to return to break-even in 2015-2016, an objective the crisis has pushed back. In 2013 the situation improved slightly: the net loss should not exceed 300 million euros (against 450 million in 2011) and the operating margin will be -200 million euros. That is a little better than in recent years, but nothing to celebrate... Over 10 years, the accumulated debt has reached 3 billion euros. To soften the picture, SNCF executives are quick to point out that times are hard for every incumbent operator: in France, the private operators, despite a competitive advantage in how work is organised, are not making money either. Under the Excellence 2020 plan, freight volumes, which have hit their lowest point (22 billion tonne-kilometres), are to reverse the trend with annual growth of 2%.

First conclusions of the Data Scientist profession study: a hybrid, not a revolution

Read at: http://www.decideo.fr/Premieres-conclusions-de-l-Etude-Metier-Data-Scientist-une-hybridation-pas-une-revolution_a6457.html

Thierry Vallaud, Socio Logiciels, 16 October 2013

Following my previous contribution, "Le data scientist c'est du naming", in which I explained why I see the Data Scientist less as a brand-new profession than as a natural evolution of the Data Miner's job, I decided to question recognised market experts through an open-ended questionnaire. The aim: to get past the myth and gather the pragmatic view of acknowledged data specialists. What these experts have in common is that they have already used the term "Data Scientist" themselves, at industry gatherings or in the trade press.



Thierry Vallaud, Socio Logiciels
Ahead of the quantitative analysis, which will be published shortly, the first lessons confirm my earlier position.

Defining the Data Scientist: a hybrid of professions we already know
Most respondents cite Data Miner as the previous name for what is now called a Data Scientist. The terms "statistician", "statistical studies officer" and "statistical analyst" also come up. One respondent notes that it varies by country: in the United States the profile tends to be "smart software engineers with a solid scientific background", whereas in France Data Scientists tend to be statisticians trained in scientific grandes écoles.

Multiple skills built on existing bodies of knowledge
For all respondents, the Data Scientist's foundation is statistics. The notion of multiple skills runs through the experts' answers, and several agree that the Data Scientist must hold a triple operational competence:
• mastery of Data Mining and statistical techniques,
• an affinity for database technologies and tools,
• business know-how in the application domain of the data under study (Marketing is the field most often cited).
One respondent questions the notion of an omniscient Data Scientist combining all three competences single-handedly, and suggests that the key to success will more likely lie in getting managers, statisticians and software engineers to collaborate.

Tasks previously assigned to data miners
Day to day, how tasks are shared out within the Data Scientist role varies widely by sector, by the stakes involved and by level of experience. Nevertheless, most respondents cite, in full or in part, the major steps that structure a project:
• Understand the business problem, the stakes and the goals of the analysis; translate a business problem into a mathematical/statistical one;
• Obtain suitable data: find the relevant data sources; recommend which databases to consolidate, modify, repatriate, outsource or insource; design datamarts or even data warehouses;
• Assess the quality and richness of the data, analyse it and report the results; integrate them into the target information system.
Some respondents point out that most of these tasks already fell to data miners.

No dedicated training yet
The training paths recommended by respondents are mainly the engineering schools specialised in statistics (ENSAI, ENSAE, Télécom), the more generalist grandes écoles such as Centrale and Polytechnique, and continuing education at the CNAM. In every case, the curriculum must combine statistics and Data Mining with computer science. Some confirm that universities seem ill-suited for now (recent tools are not taught there) and advise against them. Others even recommend a path that is more practical than theoretical, through self-teaching or learning Data Mining "in the field".

A promising future, provided complementary roles are embraced
Respondents' views on how the Data Scientist role will evolve are broadly positive and hopeful. One expert even notes that "Google says it is THE number-one job of the future." Indeed, most of those surveyed believe the role will be "more and more recognised" and "more and more in demand", and that true Data Scientists will be called to strategic positions within companies. With demand growing, the gap between demand and supply will keep widening, and some roles will inevitably evolve towards the Data Scientist function. In practice, Data Scientists will need to master a wider range of technologies and tools, show more teaching skill, and deepen their command of Data Mining still further. Data volumes will of course grow, data types will become ever more heterogeneous and application domains will broaden, so the range of methodologies to master will expand accordingly.
One specialist suggests that the data-management duties currently shouldered by the Data Scientist should be handed over to Data Managers supervised by a Data Scientist, so that the latter spends less time preparing data and more time analysing and interpreting results.

In the end, for some respondents there would not be "one Data Scientist" but several kinds of Data Scientist, with different skills, who ought to work together.

To be continued!

Master Data Management and Big Data: what synergies?

Read at: http://www.decideo.fr/Master-Data-Management-et-Big-Data-quelles-synergies_a6452.html

Pascal Anthoine, Micropole
15 October 2013

In 2012, the market for Master Data Management (MDM) software grew by 21%, according to Gartner, to reach 1.9 billion dollars, and the firm expects it to pass the 3-billion-dollar mark in 2015. This progression reflects companies' growing interest in MDM. In the age of Big Data, should we see a cause-and-effect relationship?
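As a quick plausibility check on the figures above (my own arithmetic, not Gartner's), the implied compound annual growth rate between the 2012 figure and the 2015 projection can be computed directly:

```python
# Implied compound annual growth rate (CAGR) between Gartner's 2012
# MDM market figure and its 2015 projection (illustrative arithmetic).
market_2012 = 1.9e9   # dollars (Gartner, 2012)
market_2015 = 3.0e9   # dollars (Gartner projection for 2015)
years = 2015 - 2012

cagr = (market_2015 / market_2012) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # about 16% a year
```

In other words, the projection assumes the 21% growth recorded in 2012 cools only somewhat over the following three years.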



Pascal Anthoine, Director of the Entreprise Information Management (EIM) Practice at Micropole
Access to reference data (customer, product, asset...) that is at once unique, consistent and aligned with business needs: what company, in today's environment, does not need it? Likewise, growing data volumes, the rise of digital and the social data it carries, and the cloud are pushing companies to put ever more pressure on IT departments to optimise the management, and hence the use, of strategic data. It is therefore fair to ask what synergies there can be between MDM and Big Data, or quite simply whether any exist. There is a genuine desire to associate and link a company's data, and to enrich it with the semi-structured or even unstructured information coming from social and digital channels. That desire goes hand in hand with the use of inferential statistics and predictive analytics on these very large data volumes.

The objective: extracting value from information at very large scale
Obtaining reconciled, consolidated, high-quality information: that is the real challenge Big Data poses to IT departments. Without at least a minimum of structure, no processing of Big Data can be envisaged, and extracting value from it becomes difficult. Once structured, Big Data can in turn enrich the company's master data. Yet today, projects combining the two are very rare. The reason? There is no strict legal framework, notably for social networks; legislating on companies' use of social data is genuinely difficult. How will fans and web users react to companies using their data? What are the terms of use?
For now, the precautionary principle prevails in companies, given the lack of clarity from the CNIL. But the need to centralise information from the various channels remains, especially with digital. To do so, parsing and data-matching algorithms and techniques can be applied to very large volumes.
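To illustrate the kind of data matching mentioned above, here is a minimal sketch, using only the Python standard library, of how two customer records from different channels might be reconciled. The field names, weights and threshold are assumptions for the example, not a production MDM algorithm:

```python
# Minimal illustrative sketch of fuzzy record matching (not a
# production MDM tool): two records are linked to the same "master"
# customer when their weighted name/email similarity is high enough.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Ratio in [0, 1] between two normalized (lowercased, trimmed) strings."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def match_records(rec_a: dict, rec_b: dict, threshold: float = 0.85) -> bool:
    """Weights and threshold are arbitrary choices for this example."""
    name_score = similarity(rec_a["name"], rec_b["name"])
    email_score = similarity(rec_a["email"], rec_b["email"])
    return (0.6 * name_score + 0.4 * email_score) >= threshold

crm = {"name": "Jean Dupont", "email": "jean.dupont@example.com"}
web = {"name": "jean dupont ", "email": "jean.dupont@example.com"}
other = {"name": "Marie Martin", "email": "marie.martin@example.com"}

print(match_records(crm, web))    # True
print(match_records(crm, other))  # False
```

Real projects replace `difflib` with dedicated matching engines and add parsing, blocking and survivorship rules, but the principle is the same: score candidate pairs, then merge those above a threshold.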

Breaking down channel silos
Master Data Management guarantees the data needed to make customer-facing processes more effective. Most projects address customer-oriented BtoC issues (often in the retail sector), fit into a cross-channel and omnichannel strategy, and require unified information.
MDM also features in broader corporate data-governance projects, including overhauls of the information system, for which an integrated solution covering data analysis, cleansing and management should be preferred. Data quality is indeed increasingly built into MDM, which is why solutions now ship with an integrated dashboard of quality indicators. The payoff: steering the effort and getting the company's many business functions to work together.

The keys to an MDM project
Today, with IT budgets shrinking, projects require many meetings and substantial upstream work to identify the expected gains, which must then be validated once the project is under way.
Many companies had already rolled out several ERP projects: their MDM project gives them the opportunity to make the whole consistent, and at a lower cost.

The main steps to follow
· Upstream: pinpoint where a governance project will genuinely pay off, through a thorough but rapid analysis of the existing situation.

· During the scoping phase: identify the work packages and move to production (6 months on average) over short project cycles; support companies as they take ownership of their data.

· Downstream: a governance policy that is both operational and sustainable.

23 million Europeans made a purchase from their smartphone in August

Read at: http://www.zdnet.fr/actualites/23-millions-d-europeens-ont-realise-en-aout-un-achat-depuis-leur-smartphone-39794960.htm

Figures: according to comScore, however, the smartphone is first and foremost a purchase-decision aid, and only secondarily a purchasing device.
Little by little, m-commerce is gaining ground, driven by the rise of smartphones and brands' cross-channel strategies, notably via social networks.
According to comScore MobiLens figures, 14.6% of smartphone owners in Europe (France, Germany, Great Britain, Spain, Italy), nearly 23 million people, made at least one purchase from their device last August.
37% more m-buyers in a year
That is 37% more than a year ago... Topping the list of products most often bought from a mobile are clothing and accessories (5.4% of the smartphone audience), then consumer electronics and household appliances (3.8%), tied with books (excluding e-books), followed by ticket bookings and personal-care products.
In addition, 20% of European users say they have visited a shopping app or site from a smartphone at least once, 3 points more than a year ago. The rate reaches 27.6% in Great Britain but only 11.7% in France (up 1.1 point in a year). Are French e-commerce sites, poorly optimised for smartphones, to blame?
But ahead of the purchase itself, the smartphone is above all a decision-making aid: nearly a quarter of smartphone users have taken a photo of a product in a store at least once, 14% have discussed a product with friends or family, 11% have scanned a product's QR code and 8% have compared prices on their device.
According to Forrester this time, m-commerce revenue should grow from 1.4 billion euros in 2011 to 19.2 billion in 2017. By then, 79 million mobile users will have made a purchase on mobile, against 7.6 million in 2011. In concrete terms, that will represent almost 7% of total web sales in 2017.
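A quick back-of-the-envelope check of the figures quoted above (my own arithmetic, derived from the comScore and Forrester numbers, not published by them):

```python
# comScore: 23 million buyers = 14.6% of EU5 smartphone owners,
# which implies the size of the installed smartphone base.
buyers = 23e6
share = 0.146
implied_owners = buyers / share
print(f"Implied EU5 smartphone base: {implied_owners / 1e6:.0f} million")  # ~158 million

# Forrester: m-commerce revenue from 1.4bn EUR (2011) to 19.2bn EUR (2017),
# which implies the average annual growth rate over those six years.
growth = (19.2 / 1.4) ** (1 / 6) - 1
print(f"Implied annual growth: {growth:.0%}")  # ~55% a year
```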

Cloud computing: organisational constraints remain a brake

Read at: http://www.atelier.net/trends/articles/cloud-computing-contraintes-organisationnelles-representent-toujours-un-frein_424724

L'Atelier, 21 October 2013

The cloud market is about to mature very quickly. Yet concerns remain over security, integration with internal processes and service reliability.
While the second half of 2012 saw strong growth in internal cloud deployments, 2013 is shaping up as the year of provider evaluation for IaaS and SaaS projects, according to the cloud computing study by 451 Research. Indeed, while companies' online-service deployments will largely have migrated within two years, the market for off-site deployments remains sluggish: outsourcing of traditional services has edged up only one point, to 13%. The first culprits are the datacenters, which, to be considered reliable, still have to prove the flexibility, automation and adaptability of their operations. Above all, though, the blame lies with organisational constraints.

What is holding back deployments of traditional architectures

Companies' lack of technological maturity is not, in fact, the leading brake on deployments. Two-thirds of the study's respondents point instead to constraints unrelated to IT: internal organisation and budgets account for 37% of these concerns, and confidence in the solutions for 16%. Security policies, regulatory-compliance questions and a lack of time to devote to the subject also loom large. As a result, functions such as performance management, cross-charging between the cloud and internal systems, storage services and automated networking feature little or not at all in current projects. The cloud solutions deployed today are in fact only the visible tip of the iceberg of changes this technology makes possible.

B-to-C interface players monopolise the budgets

Most cloud-computing solutions cover customer-facing applications. Enterprise app stores and putting the company's offering online are seen as having strong deployment potential over the next two years, so a large share of cloud budgets will go to them. The median of these budgets is 675,000 dollars, but 16% of respondents reported investments above the 10-million-dollar mark. A plethora of solution providers are fighting over these budgets: Microsoft, VMware and Amazon.com are already well established, but players such as OpenStack could quickly change the game.

Computer archiving that could survive... a billion years

Read at: http://www.futura-sciences.com/magazines/high-tech/infos/actu/d/high-tech-archivage-informatique-resisterait-milliard-annees-49719/#xtor=EPR-17-[QUOTIDIENNE]-20131022-[ACTU-Un-archivage-informatique-qui-resisterait----un-milliard-d-annees]

While the density of digital data storage has advanced greatly in recent years, the longevity of the media remains its Achilles heel. A team of researchers has studied the thermal properties of tungsten and created an optical disc that could survive for up to a billion years. Jeroen de Vries, one of the scientists involved in the project, answered Futura-Sciences' questions.

This optical disc, created by a team of researchers, takes the form of a tungsten wafer coated with a layer of silicon nitride, whose transparency makes it easy to read. The data are etched as QR codes of different sizes, each code containing another in the manner of Russian nesting dolls, which allows a potentially high storage density. © Institute for Nanotechnology, University of Twente

Un nouveau supraconducteur découvert d'abord par ordinateur

A lire sur:  http://www.futura-sciences.com/magazines/matiere/infos/actu/d/physique-nouveau-supraconducteur-decouvert-abord-ordinateur-49615/#xtor=EPR-17-[QUOTIDIENNE]-20131022-[ACTU-Un-nouveau-supraconducteur-decouvert-d-abord-par-ordinateur]

On s'en doutait depuis quelques années, c’est maintenant prouvé : il est possible de concevoir de nouveaux supraconducteurs ab initio sur ordinateur. Une équipe de chercheurs vient de le montrer en synthétisant l'un de ceux conçus par les physiciens. Cependant, il ne s'agit encore que d'un supraconducteur conventionnel.

Un aimant en lévitation magnétique au-dessus d'un supraconducteur à haute température critique. C'est l'expulsion du champ magnétique du matériau supraconducteur (effet Meissner) qui est responsable de cet effet de lévitation. On est peut-être sur le point de concevoir de tels matériaux exotiques avec des ordinateurs. © Mai-Linh Doan, Wikimedia Commons, GNU 1.2
La supraconductivité a été découverte il y a plus de 100 ans, le 8 avril 1911. Elle a fasciné bien des physiciens, comme Vitaly Ginzburg et Pierre-Gilles de Gennes, et a donné lieu à l’attribution de plusieurs prix Nobel. Mais de nos jours, c’est très probablement le rêve d’obtenir des supraconducteurs à température ambiante (on soupçonne que c'est possible avec du graphite) qui motive bon nombre de recherche sur les matériaux supraconducteurs.
That would trigger a major technological revolution, and not only because electricity could be transported without losses. A hypersonic train could then link Kiev and Beijing in an hour. A tokamak such as ITER, which requires superconducting magnets, would no doubt also benefit from the discovery of room-temperature superconducting materials.

Boron and iron for a superconductor

Unfortunately, since the 1986 discovery of high-critical-temperature superconductors, which raised so many hopes, that dream has not come true. One reason is that we still do not really understand what happens in these materials. In general, any finer knowledge of what makes a compound superconducting can help physicists in their quest.
The CEA's Tore Supra tokamak uses superconducting magnets cooled with liquid helium. Room-temperature superconducting materials, if they were ever discovered, could no doubt ease the construction of similar fusion reactors. © CEA
About four years ago, the physicist Aleksey Kolmogorov wrote a program called Maise (a word meaning beauty, grace or elegance in Scottish Gaelic), for Module for Ab Initio Structure Evolution. It discovers stable crystal structures in boron-based materials. In 2010, he and his colleagues realized that a certain crystalline phase of iron tetraboride (FeB4) should not only be stable but, above all, superconducting. As icing on the cake, it was even possible to estimate the critical temperature below which superconductivity should appear: about 20 K. The researchers were not, however, the first to potentially discover a new superconductor with a computer.
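The idea behind structure-search tools of this kind can be caricatured as an evolutionary loop: generate candidates, score them with a computed energy, keep the best, mutate, repeat. The sketch below is a deliberately simplified stand-in, not the actual Maise algorithm: a one-dimensional made-up "energy" landscape replaces the density-functional-theory evaluation of real crystal structures.

```python
import random

# Toy evolutionary structure search: evolve a scalar "lattice parameter" x
# to minimize an invented energy E(x). Real codes evaluate full candidate
# crystal structures with ab initio methods instead.
def toy_energy(x: float) -> float:
    return (x - 2.5) ** 2 + 0.1 * (x ** 2)  # minimum near x = 25/11 ≈ 2.27

def evolve(pop_size=20, generations=50, seed=0):
    rng = random.Random(seed)
    pop = [rng.uniform(-5, 5) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=toy_energy)
        survivors = pop[: pop_size // 2]                       # truncation selection
        children = [x + rng.gauss(0, 0.1) for x in survivors]  # Gaussian mutation
        pop = survivors + children
    return min(pop, key=toy_energy)

best = evolve()
print(best)  # converges close to the minimum of E
```

The real difficulty, which the toy hides entirely, is the cost of each energy evaluation and the combinatorial size of the space of crystal structures.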

A superhard superconductor

Since many specialists in the field were skeptical, Kolmogorov set out to synthesize the new material. To do so, he teamed up in particular with Natalia Dubrovinskaia and her colleagues at the University of Bayreuth, Germany. The researcher is famous for having obtained, a few years ago, a boron-nitride-based material almost as hard as diamond. The physicists then launched a series of experiments involving high pressures, of the order of 8 GPa, and high temperatures. After a year of work, they managed to produce a small sample of iron tetraboride.
The material obtained was indeed superconducting, but at a temperature of about 3 K rather than the expected 20 K. In the paper they published on arXiv, the researchers attribute this difference to defects in the FeB4 crystal. Finally, iron tetraboride turned out to have an exceptional hardness, which had not been predicted.
"The discovery of this superhard superconductor demonstrates that new compounds can be created by revisiting apparently well-studied materials," Kolmogorov says. "Now that this material has been synthesized, it may be possible to modify it to raise the temperature at which it becomes a superconductor."

Personal computing: PCs will be the only market to shrink in 2013

Read at: http://www.distributique.com/actualites/lire-informatique-personnelle-le-pc-sera-le-seul-marche-a-decroitre-en-2013-20912.html


Evolution of personal computing device sales between 2012 and 2014.
Worldwide sales of traditional computers are expected to fall by more than 10% this year, a figure that drags on the overall personal computing market (PCs, ultramobiles, smartphones, tablets), for which the research firm Gartner expects volume growth of 4.5% this year.
According to Gartner's latest study of personal computing, combined sales of traditional PCs, ultramobile computers, tablets and smartphones will reach 2.31 billion units worldwide in 2013, 4.5% more than last year. All four product segments should contribute positively to the sector's growth, with the unsurprising exception of traditional PCs (desktops and notebooks).

In detail, the research firm expects sales of traditional PCs to contract by around 11.2%, to 303 million units. And despite the growth expected for ultramobile shipments (+90%, to 18.6 million units), it will not be enough to pull overall PC sales into positive territory: across all categories combined, computer sales will fall by 8.4%.
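Gartner's figures are internally consistent; working back from the 2013 volumes and growth rates quoted above, the combined decline does come out at about -8.4%:

```python
# Back out 2012 volumes from the 2013 forecasts quoted above,
# then check the combined PC decline.
classic_2013, classic_growth = 303.0, -0.112   # million units, -11.2%
ultra_2013, ultra_growth = 18.6, 0.90          # million units, +90%

classic_2012 = classic_2013 / (1 + classic_growth)
ultra_2012 = ultra_2013 / (1 + ultra_growth)

total_change = (classic_2013 + ultra_2013) / (classic_2012 + ultra_2012) - 1
print(f"{total_change:+.1%}")  # -8.4%, matching the article
```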

+53% growth for tablets

Compared with the traditional PC market, the tablet market is far more satisfying for manufacturers. For 2013, Gartner expects sales to grow by 53.4%, to 184 million units worldwide. One caveat, however: the average price of high-end tablets keeps falling in the 7-inch segment as a growing number of consumers develop an interest in smaller devices. At present, the average screen size of tablets in circulation is between 8.3 and 9.5 inches.

In the smartphone market, manufacturers' ability to command a high average price is well and truly over. According to Gartner, mid-range handsets will drive sales growth in mature markets, while entry-level Android phones will be the stimulus in emerging countries. None of this is likely to lift the market's growth in value terms. Its volume growth for 2013 is expected at +3.3%, to 1.8 billion units, according to Gartner.

By F.A.

DRAM suppliers return to profitability

Read at: http://www.itchannel.info/articles/144164/fournisseurs-memoire-vive-renouent-rentabilite.html

Monday 21 October 2013
Despite falling PC sales, DRAM manufacturers are once again posting handsome profits.

Their operating margin reached 27% between April and June, after topping 11% in the first quarter. This is mainly due to rising average selling prices (+4% in the first quarter and +12% in the second) and to well-judged production adjustments by the market's three main players.
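To see why rising average selling prices move margins so strongly, assume (purely for illustration; the article does not give cost figures) constant unit costs: an 11% margin means costs of 89% of the old price, so a 12% price rise alone lifts the margin to roughly 20%, with the rest of the jump to 27% coming from the supply-side production adjustments.

```python
# Illustrative only: constant unit cost, average selling price up 12%.
q1_margin = 0.11
unit_cost = 1 - q1_margin        # cost assumed fixed at 89% of the old price
new_price = 1.12                 # +12% ASP in Q2
new_margin = (new_price - unit_cost) / new_price
print(f"{new_margin:.1%}")       # 20.5%: the price effect alone
```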

South Korea's SK Hynix proved the most effective at controlling its output and the most profitable, with a 33% margin, followed by Japan's Elpida (32%) and Samsung (28%).

Several suppliers that were struggling in the first quarter have now righted the ship, including Micron Technology (12%), Inotera (27%) and Winbond Electronics (6%). Two further Taiwanese producers join them: Nanya Technology (1%) and Powerchip Technology (29%).

Note that, for the first time in 30 years, memory manufacturers derive less than 50% of their revenue from modules destined for PCs.


Wednesday 30 October 2013

Application performance remains a major productivity issue for enterprises

Read at: http://www.atelier.net/trends/articles/performance-applications-entreprise-reste-un-enjeu-de-productivite-majeur_424708

21 October 2013

As applications penetrate companies' working habits, the problems inherent in their use increase in step.
While the benefits that applications bring to everyday work are not in dispute, the few arguments that slow their adoption center mainly on security: their systematic, structuring use exposes companies to certain associated risks. Beyond security, the mere functioning of these now indispensable applications is not guaranteed, and system slowdowns and errors create real costs for businesses. 79% of the companies surveyed, in Europe and the United States alike, report erratic application performance as a major productivity issue, despite continually rising IT budgets. That figure opens the Killer Apps 2013 study by Ipanema and Easynet, which shows the shift under way from integrating new tools to paying attention to the real impact of the tools already in place.

A problem at the heart of operations

As applications take a dominant place in the workplace, the problem of their performance grows: 54% of the companies surveyed acknowledge a clear increase in slowdowns or unresponsiveness this year, and 18% even describe these difficulties as very frequent. Business and video applications appear the most vulnerable to these technical faults, together accounting for more than 60% of the problems recorded; business applications alone account for 45% of application-related difficulties in American companies. The main issue emerging from these figures is that such applications are in most cases integrated from a project's inception, so any performance problems affect the entire operational chain: 86% of American companies already build video applications into their operations.

How to cope with growing data volumes?

The growing number of applications does not merely reflect companies' desire to modernize, however; it answers a quantitative increase in the data to be processed. 40% of executives expect data volumes to grow by more than 20% per year, and the rest of those surveyed by more than 10%. This expectation goes hand in hand with significantly larger IT budgets: 52% of respondents report growing IT budgets, against a little under 30% the previous year. Companies thus face two major challenges around applications: on one hand, rationalizing IT spending and maximizing its efficiency; on the other, keeping up with evolving needs, notably in bandwidth, to guarantee the performance of the applications already in place.
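A 20% annual increase compounds quickly, which is why bandwidth budgets keep climbing; over five years it already amounts to roughly a 2.5x increase in data volume:

```python
# Compound growth of data volume at the annual rates reported in the survey.
def growth_factor(annual_rate: float, years: int) -> float:
    return (1 + annual_rate) ** years

print(growth_factor(0.20, 5))   # ~2.49: data volume up 2.5x in five years
print(growth_factor(0.10, 5))   # ~1.61 for the more conservative respondents
```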
 

The connected-home market hinges on multichannel development

Read at: http://www.atelier.net/trends/articles/marche-de-maison-connectee-passe-un-developpement-multicanal_424690

18 October 2013

To improve day-to-day energy management, the telecom, energy and equipment sectors are joining forces with cloud specialists.
The market for smart home energy management (Home Energy Systems) is valued at more than $1.5 billion in 2013, according to a Greentech Media report. This market's specificity lies in the diversity of the players involved: device makers, telecom operators and energy producers are all trying to benefit from the growth of a market encouraged by public authorities. To speed the transition to widespread, more frugal consumption, these traditional players increasingly collaborate with startups built on cloud technology and use collaboration tools (APIs) to develop new services without a heavy renovation of the existing housing stock.

Putting available household data to work

EcoFactor, the leader in cloud-based energy-reduction services, has just raised more than $10 million, notably from NRG Energy, a leading green-energy player in the United States. The company uses tailored algorithms and software to cut its users' household energy bills. EcoFactor thus exploits the data that homes generate continuously to help households shift toward more moderate consumption without drastically affecting their day-to-day comfort. Several sources are favored, notably connected thermostats, weather data and consumption habits; these are then processed by optimization algorithms that personalize the follow-up offered to consumers.
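EcoFactor does not publish its algorithms; the sketch below is only a minimal caricature of the idea (the comfort band and forecast values are made-up assumptions): choose an hourly setpoint that tracks the outdoor forecast while staying inside the user's comfort range, since a smaller indoor/outdoor gap means less heating or cooling work.

```python
# Toy setpoint optimizer: clamp each hour's outdoor forecast to the
# comfort band. Keeping the setpoint as close as possible to the
# outdoor temperature minimizes the heating/cooling effort.
def plan_setpoints(outdoor_forecast, comfort_low=19.0, comfort_high=23.0):
    return [min(max(t, comfort_low), comfort_high) for t in outdoor_forecast]

forecast = [12.0, 15.0, 20.0, 26.0, 28.0, 22.0]   # degrees C, hypothetical
print(plan_setpoints(forecast))
# -> [19.0, 19.0, 20.0, 23.0, 23.0, 22.0]: clamped to the comfort band
```

A real service would additionally weigh tariffs, thermal inertia of the building, and learned occupancy habits; the clamping rule above is just the simplest stand-in.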

The smart home, the fruit of collaboration between different players

The smart home market sits at the crossroads of several industries: services, energy, security, telecoms and cable providers. While the market is still highly volatile and holds great development potential, Greentech Media expects it to double, reaching as much as $4 billion by 2017. This development already relies on one-off or strategic collaborations between leaders from different sectors. The point of these partnerships is to accelerate the transition without requiring users to buy a new device: Comcast customers, for instance, use the same EcoSaver thermostat to benefit from EcoFactor's services. In the same vein, Comcast has also partnered with a maker of connected light bulbs that can potentially cut the average electricity bill by up to 83% while lasting 18 times longer than incandescent bulbs.
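The "up to 83%" figure is consistent with swapping a classic 60 W incandescent bulb for a roughly 10 W connected LED; those wattages are our illustrative assumption, not Comcast's product spec:

```python
# Per-bulb consumption saving, assuming typical wattages.
incandescent_w, led_w = 60.0, 10.0
saving = 1 - led_w / incandescent_w
print(f"{saving:.0%}")   # 83%: matches the claimed reduction
```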
 

Digital France, an unequal territory

Read at: http://www.atelier.net/trends/articles/france-numerique-un-territoire-inegal_424692

18 October 2013

A public report highlights disparities in digital development and investment across France's geography.
Written by Claudy Lebreton and entitled Les territoires numériques de la France de demain, the report, delivered to Cécile Duflot, minister for Territorial Equality and Housing, takes on the heavy and complex task of offering public and private players a detailed, precise panorama of the territorial-planning challenges raised by digital technologies. It follows the government's commitment to bring superfast broadband everywhere in France within 10 years. Yet while a little under 10% of French territory is today technically ineligible for superfast broadband, in practice access to digital technology is deepening existing inequalities, geographic as well as sociological. From the outset, Claudy Lebreton frames the digital challenge not as a tool but as an essential factor in the technological revolution under way: "What the thinkers of the Age of Enlightenment achieved with the printed book, we must now envisage for digital technology."

Digital inequalities track social inequalities

While a public investment policy has existed since the early 1990s, the approach until now has been confined to buying equipment and building infrastructure capable of carrying broadband. For the report's author, however, it is not just a matter of creating a favorable structure but of supporting the digital transition itself. The report thus points to a reluctance in French society toward digital transformations, despite recognition of their usefulness, and the picture across the territories reinforces that impression.

In less than 10 years the number of internet users has quadrupled, reaching 41 million in France today, with already more than 24 million mobile users. Society's relationship to the internet as a tool has also shifted: respondents now rank it the third most useful tool for finding a job, at 63%, against only 56% for Pôle Emploi. Yet while 80% of the population has access, up from 14% in 2000, that access is unevenly distributed. What Claudy Lebreton calls e-exclusion deepens three great divides. Generational, first: while digital natives are well equipped, only 47% of 60-74-year-olds and 10% of the over-75s own a computer. Social, next: only 59% of households with an income below 900 euros have their own access. Cultural, finally, where the gap between graduates and non-graduates is heavy: 93% for those with higher education, barely over 30% for those with no qualification.

The same gap shows up in isolation: 65% of the most socially isolated people do not use the internet, and geographic enclosure adds to it. The national map shows only 35 zones across the whole of France, centered on the largest cities, with cable networks above 100 Mbit/s. The north-south line is all the more telling, revealing a digital void between Perpignan, Paris and Lille.

"Software is eating the world" — Marc Andreessen

This widening of inequalities is proving a major difficulty, economic as well as social, for the French population. Where digital technology, by abolishing distance, allows real opening-up, rural living areas and the urban periphery alike still lack sufficient infrastructure. Likewise, the lack of support in using digital tools produces unequal results: the benefits of digital technology reproduce the social transmission of cultural capital, with the best-off learning more than the others even within the digital generation. The report highlights the potential impact of digital capabilities, from promoting tourism assets to school support, through the revitalization of isolated territories and of the residential economy thanks to telework. Digital technology would thus be a strong asset for the territories: by reducing the weight of natural or demographic advantages and dematerializing production, it makes the need for the "city", in the traditional sense, less important.

How to balance maintenance and IT innovation

Read at: http://www.computerworld.com/s/article/9243312/How_to_balance_maintenance_and_IT_innovation

Many IT leaders admit their spending is too heavily weighted toward keep-the-lights-on projects. Here's how to tip the balance.

By Minda Zetlin
October 21, 2013 06:00 AM ET

Computerworld - Social! Mobile! Big data! BYOD! You probably already know what your company's executives most want to see from your IT organization. But unless your company is very new, or you're unusually lucky -- or a very, very good manager -- more than half your time and resources are spent, not on innovative projects, but on "keep the lights on" activities whose sole purpose is to prevent existing systems from breaking down. And sometimes the percentage is a lot higher than that.
"I've seen companies where it's 80% or 90% of the IT budget," says Columbia Business School professor Rita Gunther McGrath, who examined this issue for her book The End of Competitive Advantage: How to Keep Your Strategy Moving as Fast as Your Business. "I think it should be no more than 50%," she adds.
Most CIOs would agree with her, but can't achieve that 50-50 split in their own budgets. In a recent Forrester Research survey of IT leaders at more than 3,700 companies, respondents estimated that they spend an average 72% of the money in their budgets on such keep-the-lights-on functions as replacing or expanding capacity and supporting ongoing operations and maintenance, while only 28% of the money goes toward new projects.
Another recent study yielded similar findings. When AlixPartners and CFO Research surveyed 150 CIOs about their IT spending and their feelings about IT spending, 63% of the respondents said their spending was too heavily weighted toward keeping the lights on.

Why So Difficult?

If no one wants to spend such a huge portion of IT's funds just to run in place, why does it keep happening? One explanation lies in the term "keeping the lights on" itself: Turning the lights off isn't an option. "It's the ante that allows you to hold on to your job," says Eric Johnson, CIO at Informatica, a data integration company in Redwood City, Calif., with annual revenue of $812 million. "If the systems are down and the phones aren't working, no one will care how innovative you are."
Of course, new projects are very important, so the challenge is to do both. "CIOs are striving to be business executives, truly driving value for the organization," Johnson says. "That's why there's so much emphasis on keeping the lights on while still finding the budget to drive innovation."
A bigger problem has to do with the traditional approach to IT at most companies, where techies who are expected to abide by the principle that "the customer is always right" find themselves creating unwieldy systems in an ongoing effort to give the business exactly what it asks for. Keeping those systems running is usually difficult, time-consuming and expensive. "I've worked with a lot of companies where the CEO says, 'I want you to do this, this and this.' The CIO says, 'That'll be $5 million.' The CEO says, 'Do it for $3 million.' So it's patch, patch, patch," McGrath says. That approach creates "technical debt" -- something you'll have to go back and pay for later -- according to Bill Curtis, chief scientist at CAST, a software analysis company headquartered in Meudon, France, with annual revenue around $47 million.
Similar problems arise when IT tries to satisfy business needs too quickly. "Sometimes these things were built as 'Let's just get something up and see how it works,'" Curtis says. "Things that were designed as a demo suddenly have to grow. Or even if something was designed appropriately for what they thought would be the use, people kept adding new requirements and features until it became a kludge."
Perhaps worst of all is the tendency to customize licensed software in an effort to fulfill business requirements -- whether or not those requirements have any real bearing on the organization's goals or success. "We talk about business capability -- the list of things a business needs to do to be successful and achieve its goals," says Nigel Fenwick, an analyst at Forrester Research. "Out of 30 high-level capabilities, maybe two or three are differentiators." When senior executives understand this well, he says, they encourage IT to focus on those key areas and seek standardized, easy-to-maintain solutions for everything else.

Tuesday 29 October 2013

Magic Quadrant for Master Data Management of Customer Data Solutions

Read at: http://www.gartner.com/technology/reprints.do?id=1-1LWGEO3&ct=131017&st=sb

17 October 2013 ID:G00251784
Analyst(s): Bill O'Kane, Saul Judah


The MDM of customer data solutions market segment grew healthily in 2012. New acquisitions and integrations of prior acquisitions by the Leaders have continued, and several visions for linking MDM and social data have emerged. This Magic Quadrant will help you find the right vendor for your needs.

Market Definition/Description

Markets are sets of potential buyers that view a product as solving a common, identified need, and that reference each other. Market segments are portions of a market that are qualified by more exact criteria, thus grouping potential buyers more tightly. Segmentation may take two forms:
  • A generic market may be divided into recognizable submarkets, where the same rules prevail for defining a market.
  • An individual vendor may segment a market to target its products more precisely and differentiate itself from (or avoid competing with) other players that address the same overall market. However, the targeted buyers may not know they are part of the same market segment. Such segmentation will not be reflected explicitly in this Magic Quadrant, although it may be reflected implicitly — for example, via placement of a vendor in the Niche Players quadrant.
Master data management (MDM) is a technology-enabled discipline in which business and IT teams work together to ensure the uniformity, accuracy, stewardship, semantic consistency and accountability of their enterprise's official, shared master data assets. Master data is the consistent and uniform set of identifiers and extended attributes that describes the core entities of an enterprise, such as customers, prospective clients, citizens, suppliers, sites, hierarchies and the chart of accounts.
MDM of customer data solutions are software products that:
  • Support the global identification, linking and synchronization of customer information across heterogeneous data sources through semantic reconciliation of master data
  • Create and manage a central, database-based system of record or index of record for master data
  • Enable the delivery of a single customer view (for all stakeholders) in support of various business benefits
  • Support ongoing master data stewardship and governance requirements through workflow-based monitoring and corrective action techniques
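A minimal sketch of what "identification, linking and survivorship" means in practice. This is a toy consolidation rule (exact match on a normalized email key, most-recent-value survivorship); real MDM products use probabilistic matching and configurable survivorship rules, and the records below are invented:

```python
# Toy customer-MDM consolidation: link records from two source systems
# on a normalized email key, then build a "golden record" per customer
# by keeping the most recently updated value for each attribute.
from collections import defaultdict

crm = [{"email": "Ana@Example.com", "name": "Ana Diaz", "updated": 2},
       {"email": "bob@example.com", "name": "Bob Roe", "updated": 1}]
billing = [{"email": "ana@example.com ", "name": "Ana M. Diaz", "updated": 5}]

clusters = defaultdict(list)
for record in crm + billing:
    clusters[record["email"].strip().lower()].append(record)  # identification + linking

golden = {key: max(records, key=lambda r: r["updated"])["name"]  # survivorship
          for key, records in clusters.items()}
print(golden)  # one consolidated name per customer
```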
MDM implementations and their requirements vary in terms of:
  • Instantiation of the customer master data — varying from the maintenance of a physical "golden record" to a more virtual, metadata-based, indexing structure
  • The usage and focus of customer master data — ranging across use cases for design (information architecture), construction (building the business), operations (running the business) and analytics (reporting the business)
  • Different organizations' structures spanning small, centralized teams through to global, distributed organizations
  • The latency and accessibility of the customer master data — varying from real-time, synchronous reading and writing of the master data in a transactional scenario between systems, to message-based, workflow-oriented scenarios of distributed tasks across the organization, and legacy-style batch interfaces moving master data in bulk file format
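The first bullet's contrast between a physical "golden record" and a virtual, index-based structure can be sketched as follows: a registry-style hub stores only cross-references to source-system keys and assembles the single customer view at read time. Source names, keys and the precedence rule below are illustrative assumptions, not any vendor's design:

```python
# Registry-style MDM: the hub keeps only an index of source keys per
# master id; attribute values stay in (and are fetched from) the sources.
sources = {
    "crm":     {"C-17": {"name": "Ana Diaz"}},
    "billing": {"B-99": {"name": "Ana M. Diaz", "city": "Madrid"}},
}
registry = {"master-1": [("crm", "C-17"), ("billing", "B-99")]}

def read_view(master_id):
    # Assemble the single customer view on demand; source order acts as
    # a (toy) precedence rule: the first source to supply a value wins.
    view = {}
    for source, key in registry[master_id]:
        for attr, value in sources[source][key].items():
            view.setdefault(attr, value)
    return view

print(read_view("master-1"))  # {'name': 'Ana Diaz', 'city': 'Madrid'}
```

A physical golden-record style would instead materialize and store this merged view, trading read-time latency for synchronization effort back to the sources.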
Organizations use MDM of customer data solutions as part of an MDM strategy, which in itself should be part of a wider enterprise information management (EIM) strategy. An MDM strategy potentially encompasses the management of multiple master data domains, such as customer, product, asset, person or party, supplier and financial masters. As the name suggests, MDM of customer data solutions focuses on managing customer data — a form of "party" data, whereas MDM of product data focuses on managing product data — a form of "thing" data. There are no discrete Magic Quadrants for other master data domains due to the relatively low level of interest in discrete solutions to govern those data domains in comparison with the customer and product data domains.
We are routinely asked whether we have an overall MDM Magic Quadrant. We do not. We still believe that such a Magic Quadrant would be premature, because MDM needs are very diverse (see "The Five Vectors of Complexity That Define Your MDM Strategy"), leading to different market segments, and most evaluation and buying activity still focuses on initiatives for specific master data domains. In addition, although many MDM solutions are marketed as "multidomain MDM," they do not always conform to our definition of multidomain MDM technology (see Note 1) and we find that they have many gaps in their capabilities for, and experience of, handling every data domain (see "MDM Products Remain Immature in Managing Multiple Master Data Domains").
This Magic Quadrant provides insight into the segment of the constantly evolving packaged MDM system market that focuses on managing customer data to support CRM and other customer-related strategies. It positions relevant technology providers on the basis of their Completeness of Vision relative to the market, and their Ability to Execute on that vision.

Magic Quadrant

Figure 1. Magic Quadrant for Master Data Management of Customer Data Solutions
Source: Gartner (October 2013)

Vendor Strengths and Cautions

IBM (InfoSphere MDM Advanced Edition)

IBM (www.ibm.com) is headquartered in Armonk, New York, U.S. IBM's InfoSphere MDM Advanced Edition (AE) version 11 achieved general availability (GA) in June 2013. IBM's total MDM software revenue in 2012 (estimated for all products and domains) was $311.6 million, of which $132 million was for AE for customer data. IBM's total MDM customer count in March 2013 (estimated for all products and domains) was over 800, of which 250 were for AE for customer data.
Strengths
  • Broad information management strategy: At IBM, MDM is central to a much broader big data and information management (IM) strategy and platform. This is attractive for large organizations looking for a wide range of IM capabilities from one vendor.
  • Product strategy and vision: AE is the lead IBM product for customer and multidomain MDM and has strengths in multiple MDM styles. IBM is delivering on the convergence of its legacy products into functional "editions." The included Master Data Governance facility has improving stewardship facilities, and IBM is building capabilities to link MDM and big data.
  • Robust data model and services: AE has a robust, extensible party data model. It can model other domains, and some industry-specific extensions are available. Reference customers gave high scores for industry understanding, governance support, integration and performance.
Cautions
  • Momentum slowing: IBM's overall MDM software revenue growth slowed to an estimated 3.5% in 2012, spread evenly across all data domains, and estimates suggest that revenue growth slowed significantly for AE.
  • Perceived as complex: AE appears in a large number of the client inquiries and competitive situations received and discussed by Gartner, but is often seen as having a larger technical footprint than its competitors.
  • Reference survey concerns: A below-average number of AE reference customers responded to Gartner's survey (described in Note 2), and they gave below-average scores for total cost of ownership (TCO), workflow and reporting.

IBM (InfoSphere MDM Standard Edition)

IBM (www.ibm.com) is headquartered in Armonk, New York, U.S. IBM's InfoSphere MDM Standard Edition (SE) version 11 achieved GA in June 2013. IBM's total MDM software revenue in 2012 (estimated for all products and domains) was $311.6 million, of which $71 million was for SE for customer data. IBM's total MDM customer count in March 2013 (estimated for all products and domains) was over 800, of which 350 were for SE for customer data.
Strengths
  • Broad IM strategy: At IBM, MDM is central to a much broader big data and information management (IM) strategy and platform. This is attractive for larger organizations looking to source a wide range of IM capabilities from one vendor.
  • Unique offering: SE is a robust product oriented around a registry-based implementation style with an attributed, extensible data model and powerful matching and data management functions; it has a large roster of satisfied clients.
  • Strong performance and industry focus: SE has strong proof points for extremely high volumes of business-to-consumer (B2C) data, with subsecond latency and high transaction rates. SE is very strong in the healthcare market where registry is a common requirement, and it continues to do well in the government sector where complexity in application landscapes lends itself to the registry style.
Cautions
  • Momentum slowing: IBM's overall MDM software revenue growth slowed to an estimated 3.5% in 2012, spread evenly across all data domains, and estimates suggest that revenue growth slowed significantly for SE.
  • Limited implementation: SE is limited to the registry style. Users needing other styles should consider IBM's AE or other vendors' offerings.
  • Reference survey concerns: References gave SE below-average scores for industry understanding, new feature responsiveness, pricing transparency, workflow and reporting.

Informatica

Informatica (www.informatica.com) is headquartered in Redwood City, California, U.S. Informatica's MDM 9.6 achieved GA in June 2013. Informatica's total MDM software revenue in 2012 (estimated) was $85 million, of which $70 million was for customer data. Informatica's total MDM customer count in March 2013 (estimated) was 259, of which 245 were for customer data and 180 were for multiple data domains.
Strengths
  • Multidomain and broad IM capabilities: Informatica MDM is party-data-oriented, but can readily model other data domains. Reference customers cite data model flexibility as its main strength. A planned end-of-2013 release will eliminate database management system stored procedures, providing database independence. Informatica has highly rated data quality and data integration tools.
  • Continued investment: In 2012 and 2013, Informatica acquired Data Scout, now positioned as the Informatica Cloud MDM solution, though this supports only the salesforce.com platform; Heiler Software, an MDM of product data vendor; and Active Endpoints, a vendor of business process management software (BPMS). Informatica also continues to invest substantially in core MDM development.
  • Recovered momentum: Following early missteps, Informatica has recovered to the point of being cited in twice as many competitive situations as any other vendor, as reported by all survey respondents for this Magic Quadrant. At just under 7%, its revenue growth in the customer data market in 2012 was above the market average, though lower than in 2011.
Cautions
  • Portfolio strategy: Informatica's "Universal MDM" vision, including its Heiler acquisition, is still emerging. The company must act decisively to avoid having a "disparate MDM products" message used against it by megavendors (which are resolving this issue in their own portfolios) and by smaller competitors.
  • Lack of packaged governance technology for MDM: Informatica has opted to market the use of its current product suite to enable master data governance. Organizations that desire a packaged solution to master data governance should ensure they understand Informatica's approach.

Oracle (CDH)

Oracle (www.oracle.com) is headquartered in Redwood Shores, California, U.S. Oracle's Customer Data Hub (CDH) version 12.2 achieved GA in September 2013. Oracle's total MDM software revenue in 2012 (estimated for all products and domains) was $243 million, of which $28 million was for CDH. Oracle's MDM customer count in March 2013 (estimated for all products and domains) was 1,550, of which 360 were for CDH.
Strengths
  • Strong MDM portfolio: Oracle has a broad range of MDM assets for multiple domains and use cases. Revenue growth for MDM of customer data was an estimated 10% in 2012.
  • Good fit for E-Business Suite clients: CDH is sold to users of Oracle's E-Business Suite applications, and appeals to B2B-oriented users and others with modest data volumes.
  • Good packaged data model and improving functionality: Oracle CDH has a rich party data model, derived from the E-Business Suite. CDH is increasingly drawing on more components of Oracle Fusion Middleware and the evolving standard MDM technology platform.
Cautions
  • Restricted positioning: Sales of CDH are generally restricted to E-Business Suite users. Siebel Universal Customer Master (UCM) is Oracle's lead MDM offering for customer data; with Fusion MDM slowly ramping up, CDH has virtually disappeared from our client interactions.
  • Lagging functionality: CDH has fallen behind Siebel UCM and best-in-class vendors in a number of areas, including data quality technology, data governance facilities and hierarchy management.
  • Reference survey concerns: As for 2012's Magic Quadrant, Oracle did not submit references for CDH. In prior years, multiple reference customers reported performance issues when mastering over 100,000 customer records in the hub.

Oracle (Siebel UCM)

Oracle (www.oracle.com) is headquartered in Redwood Shores, California, U.S. Oracle's Siebel Universal Customer Master (UCM) version 8.2 81110FP achieved GA in March 2013. Oracle's MDM software revenue in 2012 (estimated for all products and domains) was $243 million, of which $82 million was for UCM. Oracle's MDM customer count in March 2013 (estimated for all products and domains) was 1,550, of which 330 were for UCM.
Strengths
  • Strong MDM portfolio: Oracle has a range of MDM solutions spanning multiple domains and industries. Revenue growth for MDM of customer data was an estimated 10% in 2012.
  • Lead Oracle offering: UCM is Oracle's lead product for MDM of customer data. New features include Open UI for UCM and improved integration with Oracle's Enterprise Data Quality Management (EDQM) suite. Reference customers awarded high scores for UCM's road map visibility and support for multiple styles of MDM.
  • Strong verticalization and scalability: UCM supports Oracle's Customer Experience (CX) strategy and has versions supporting several industries. It has live transactional workloads managing more than 100 million consumers.
Cautions
  • Unclear direction: Fusion MDM is not sold aggressively for MDM of customer data; prospective customers are uncertain whether to invest in UCM or Fusion MDM. Fusion MDM did not earn sufficient revenue in 2012 to be included in this analysis.
  • Not designed for multidomain: Siebel UCM is based on a packaged party model; although it is robust and extensible, its architecture often excludes it from multidomain evaluations.
  • Requires high-level vendor support: Some reference customers reported that securing access to UCM's product management team at Oracle was a critical success factor for their implementation.

Orchestra Networks

Orchestra Networks (www.orchestranetworks.com, www.smartdatagovernance.com) is headquartered in Paris, France. Orchestra's EBX5 version 5.4 achieved GA in October 2013. Orchestra's MDM software revenue in 2012 (estimated) was $10.7 million, of which $5.3 million was for customer data. Orchestra's MDM customer count in March 2013 (estimated) was 90, of which 37 were for customer data.
Strengths
  • Strong sales momentum: Orchestra's revenue grew by 26% in this market segment in 2012 as it targeted multidomain scenarios, many of which were distinctive within the market. A cloud-based option is available.
  • Robust capabilities: EBX5 has flexible browser-based data modeling facilities. It supports XML-based and relational schemas in a single hub. Reference customers gave EBX5 high scores in almost every category except data quality reporting.
  • Supports specialized scenarios: In addition to reference data and hierarchy management, Orchestra targets specialized multidomain scenarios commonly found in organizations with B2B business models.
Cautions
  • Narrow marketing strategy: By targeting niche scenarios, Orchestra has implicitly ceded mainstream implementations to competitors, even though its offering should be quite attractive in those areas.
  • Risky sales strategy: Orchestra often sells MDM solutions to function-specific business users in the belief that these efforts lead to broad IT adoption. This strategy runs a high risk against larger competitors that are more enterprise-oriented.
  • Product and partner strategy: Orchestra needs to develop starter templates (data models and services, UIs, configured rules and metrics) to compete with larger rivals. This requires a mature partnership model; so far, Orchestra has partnered on an opportunistic basis.

SAP (MDG-C)

SAP (www.sap.com) is headquartered in Walldorf, Germany. SAP's Master Data Governance for Customer (MDG-C) version 6.1 achieved GA in December 2012. SAP's MDM of customer data software revenue in 2012 (estimated) was $30 million, of which $25 million was for MDG-C as a stand-alone hub. SAP's MDM license count in March 2013 (estimated) was 2,300, of which 1,600 were active, 930 were for customer data and 280 used MDG-C as a stand-alone hub.
Strengths
  • Broad portfolio: SAP sells NetWeaver Master Data Management (for consolidation), MDG (for centralized) and Information Steward for stewardship support. An MDG Enterprise Edition is planned for Hana-based customer data in 2014.
  • Product fit/flexibility: MDG-C is based on the Advanced Business Application Programming (ABAP) language, unlike NetWeaver MDM. Users can support ERP data management by implementing MDG "inside" Enterprise Central Component (ECC), or "outside" (but integrated with) ECC as an MDM hub, extending the data model for non-SAP data.
  • Momentum within client base: The share of SAP customer MDM sales attributed to MDG-C grew from 20% in 2011 to 90% in 2012. The largest portion of this 90% is associated with MDG-C operating as an MDM hub, as opposed to directly against an ECC ERP system.
Cautions
  • Sells primarily to SAP's ERP installed base: MDG-C is not sold as a stand-alone or best-of-breed MDM offering. This is therefore a self-imposed niche market segmentation.
  • Narrow implementation style support: MDG-C is not appropriate for the consolidation style of MDM, and NetWeaver MDM is excluded from this year's analysis due to a substantial slowdown in revenue. Until MDG Enterprise Edition becomes available and proven, clients needing consolidation-style MDM face a difficult decision: NetWeaver MDM for customer data has lost momentum, and supporting this style with MDG requires the addition of SAP Data Services.
  • Reference survey concerns: Although interest and uptake appear high, SAP identified very few reference customers for MDG-C. This may not be entirely attributable to the product itself, as such a situation often indicates a difficult or complex implementation cycle, frequently involving multiple data domains.

SAS

SAS (www.sas.com) is headquartered in Cary, North Carolina, U.S. SAS's Master Data Management version 3.2 achieved GA in December 2012. SAS's total MDM software revenue in 2012 (estimated) was $8.6 million, of which $4.2 million was for customer data. SAS's MDM customer count in March 2013 (estimated) was 292, of which 134 were for customer data, 78 of which were using SAS Master Data Management.
Strengths
  • Strong internal integration focus: DataFlux qMDM is now SAS Master Data Management, and a clear investment is being made to integrate it with other SAS products, such as Analytics, Business Data Network, Data Governance and Customer Intelligence.
  • Graduated approach: SAS offers an incremental approach: data quality and integration tools for custom builds; batch-based MDM for one domain with Master Data Foundations; and two levels (Standard and Advanced) of SAS Master Data Management.
  • Solid foundation: SAS Master Data Management has a flexible data model that can model multiple data domains, though it has the most experience with customer data. It has excellent data quality and data profiling facilities, and includes a business rule engine.
Cautions
  • Slowing momentum: Revenue growth in 2012 was negligible in a market that grew by 5.4%. Similarly, SAS had little presence in Gartner's client interactions.
  • Internal focus: SAS has recently focused on integrating DataFlux technology into its larger suite, which may leave it behind its MDM competitors in the short term in areas such as industry templates, data visualization and big data.
  • Reference survey concerns: A below-average number of SAS reference customers responded to Gartner's survey, and although SAS received high scores for data quality capabilities and new feature responsiveness, it received low scores for workflow and initial load support, and for scalability. Enhancements in the MDM release planned for the fourth quarter of 2013 appear to target these concerns.

Talend

Talend (www.talend.com) is headquartered in Paris, France and Los Altos, California, U.S. Talend's Platform for Master Data Management version 5.3 achieved GA in June 2013. Talend's total MDM software revenue in 2012 (estimated) was $8.2 million, of which $5.1 million was for customer data. Talend's total MDM customer count in March 2013 (estimated) was 63, of which 38 were for customer data.
Strengths
  • Broad IM vision: Talend has a broad platform, including highly rated data quality and integration tools. It can model multiple data domains in the same product. It acquired enterprise service bus vendor Sopera in 2010 and began an OEM relationship with BPMS vendor BonitaSoft in 2011.
  • Increasing revenue and mind share: Talend earned $5 million from the MDM of customer data market segment in 2012, up from virtually none in 2010. It submitted a full set of survey reference customers, which achieved an above-average response rate, and was cited in 10% of competitive situations by the survey respondents for all vendors.
  • Attractive prices and model: Talend uses a subscription model. Its average selling price is well below the market average, and users can download a free open-source version with limited features.
Cautions
  • Overall profitability: Although it has significant cash reserves and committed investors, Talend does not expect to be profitable until sometime in 2013. This may affect its ability to maintain necessary internal investment, should its planned trajectory not be achieved.
  • Technical orientation: Several clients and survey respondents describe Talend's software as unsuitable for business users. Low scores were given for industry knowledge and road map visibility.
  • Software flexibility and stability: Reference customers reported stability issues and a lack of configurability with Talend's data stewardship UI; however, several gave high marks for Talend's efforts to solve these issues.

Tibco Software

Tibco Software (www.tibco.com) is headquartered in Palo Alto, California, U.S. Tibco's MDM version 8.3 achieved GA in March 2013. Tibco's total MDM software revenue in 2012 (estimated) was $52.8 million, of which $15.2 million was for customer data. Tibco's total MDM customer count in March 2013 (estimated) was 270, of which 106 were for customer data.
Strengths
  • Strong momentum: Tibco's revenue in this market segment grew by an estimated 25% in 2012, and its number of licenses doubled. Tibco continues to draw on its application integration base. It is building a dedicated MDM implementation staff and a set of industry starter templates.
  • Increased presence: Traditionally stronger in product data, Tibco is aggressively targeting the customer data market segment, and seeing results. Though still relatively low, its visibility in competitive situations has also increased.
  • Product strategy: Tibco has solid multidomain and data modeling capabilities; visual MDM is a differentiating feature for data quality reporting with a Spotfire runtime license. Tibco's in-memory caching and use of tibbr for internal "social MDM" and governance are attractive. Reference customers gave Tibco high scores in almost every category.
Cautions
  • Emphasis on IT aspect: Reference customers and users of Gartner's inquiry service report an IT-focused sales and implementation process, with little attention paid to the business ownership aspects of MDM programs. Tibco will need to engage business stakeholders effectively to remain competitive.
  • Failure to market differentiators: Reference customers gave Tibco low scores for data quality reporting, and clients seeking Gartner's advice when evaluating Tibco have not mentioned its visual MDM. Given Tibco's capabilities, this indicates a lack of appropriate marketing or upgrade incentives.
  • Maintenance of focus: Given its rapid growth, Tibco may find it challenging to support MDM of customer data and product data implementations with its current level of experienced resources.

VisionWare

VisionWare (www.visionwareplc.com) is headquartered in Glasgow, Scotland, U.K. VisionWare's MultiVue version 3.2 achieved GA in October 2012. VisionWare's total MDM software revenue in 2012 (estimated) was $5 million, of which $4.7 million was for customer data. VisionWare's total MDM customer count in March 2013 (estimated) was 94, all of which were for customer data.
Strengths
  • Excellent fit for Microsoft users: VisionWare's products are attractive to Microsoft-centric organizations. MultiVue is based solely on Microsoft technologies, such as .NET, SQL Server and BizTalk.
  • Attractive prices and models: VisionWare offers perpetual and subscription licensing and special public sector pricing. Its average selling price is well below the market average for both public and private sectors. Reference customers gave the company high scores for pricing transparency.
  • Solid customer data capabilities: VisionWare has released a product called Auris to perform MDM functions within Microsoft's Dynamics environment, and has included an integration facility for reference data in its latest release. Reference customers gave VisionWare high scores for most standard MDM capabilities.
Cautions
  • Continued flat revenue: VisionWare showed little or no revenue growth in 2011, and this trend continued in 2012.
  • No multidomain vision: VisionWare has deep experience in domains such as customer and citizen data, and MultiVue can model other domains, but the vendor has not articulated a vision beyond its current niches.
  • Reference survey concerns: VisionWare achieved a low survey response rate and low scores for its rate of technology innovation, data model flexibility and data quality facilities.

Vendors Added and Dropped

We review and adjust our inclusion criteria for Magic Quadrants and MarketScopes as markets change. As a result of these adjustments, the mix of vendors in any Magic Quadrant or MarketScope may change over time. A vendor's appearance in a Magic Quadrant or MarketScope one year and not the next does not necessarily indicate that we have changed our opinion of that vendor. It may be a reflection of a change in the market and, therefore, changed evaluation criteria, or of a change of focus by that vendor.

Added

Talend — Talend is an open-source vendor and its commercial MDM solution, Talend Platform for Master Data Management, uses open-source technology, including the company's own data integration and data quality products. In the past there has been some interest in Talend's free, downloadable Open Studio for MDM, but this year Talend met the inclusion criteria with its commercial offering, based on revenue attributable to MDM of customer data and a full set of responsive implementation references.

Dropped

SAP (NetWeaver MDM) — Although this product is still included in this year's forthcoming "Magic Quadrant for Master Data Management of Product Data Solutions," SAP's emphasis in regard to MDM of customer data has shifted to the newer MDG-C product and the Hana-based MDG Enterprise Edition, the latter currently scheduled, we estimate, for release in 2014. This change in marketing and sales strategy has resulted in a reported revenue level for NetWeaver MDM that is lower than the minimum required for inclusion in this Magic Quadrant.

Inclusion and Exclusion Criteria

Inclusion Criteria

For inclusion in this Magic Quadrant, vendors were required to have:
  • Generated at least $4 million in total software revenue (licenses and maintenance) related to MDM of customer data solutions, primarily supporting operational business processes, in the prior calendar year
  • Active sales and support activities globally — that is, in at least two of the following regions: Americas, Europe, the Middle East and Africa, Asia and Australasia
  • Active sales, support and customers in multiple industries
We also collect and/or estimate additional data to ascertain the level of activity and stability of each vendor in the market, though not as part of the inclusion criteria. We looked for:
  • At least 12 live customer references for MDM of customer data solution functionality
  • At least eight new customers for MDM of customer data solutions in the prior calendar year
  • Sufficient professional services to fulfill customer demand during the next six months
  • Enough cash to fund a year of operations at the current "burn rate" (companies spend their cash reserves if a year of operations is cash-flow-negative)

Multiple Products

Vendors may have multiple products in the MDM of customer data solutions market. Where end users report a notable difference between them, each product is evaluated separately against these inclusion criteria.
On this basis, the following vendors offer multiple products and are evaluated separately:
  • IBM: two products, both qualified and included in the analysis
  • Oracle: three products, two qualified and included in the analysis
  • SAP: two products, one qualified and included in the analysis
The following vendors offer multiple products, but some of these products did not qualify for inclusion and are therefore not analyzed other than from the perspective of being of strategic importance to a vendor's MDM product strategy:
  • Oracle: Fusion Customer Hub did not meet the inclusion criteria for revenue
  • SAP: NetWeaver MDM no longer meets the inclusion criteria for revenue attributable to MDM of customer data solutions and has therefore been dropped

Exclusion Criteria

This Magic Quadrant excludes the following because they are either tangential to the main focus of MDM programs (mastering data within the organization) or so new that they have yet to affect on-premises MDM deployments:
  • Vendors that focus solely on analytical (downstream) MDM requirements. We use only revenue from operational MDM installations for qualification, since this is where the bulk of MDM effort goes.
  • Vendors reselling another vendor's MDM of customer data solution without extending its functionality. Likewise, royalties from an OEM or resale by another vendor are not credited to the provider of the OEM technology; original software revenue from the end-user acquisition is credited to the reselling vendor.
  • Hosted and cloud-based services, marketing service providers and data providers that provide trusted reference data external to the enterprise but do not provide an MDM of customer data solution that specifically meets Gartner's definition.

MDM of Customer Data Solution Product Description

This market is characterized by packaged software solutions that bring together a range of technologies and capabilities to sustain the idea of a "single golden record" for customer master data. This is the primary focus of this analysis. These products include the following functional capabilities:
  • Data modeling capabilities — The applicability of the data model to your organization is a fundamental requirement. It must:
    • Model the complex relationships between the application sources inside the organization and its products and services, as well as with intermediaries and other parties, with the ability to handle complex hierarchies.
    • Map to the master customer information requirements of the entire organization.
    • Be configurable, customizable and extensible, but also upgradable.
    • Support industry-specific requirements, as well as multiple hierarchical and aggregated views associated with customer data structures related to consumer systems, and so on. This is particularly important across operational and analytical MDM requirements.
    • Provide a base for the required workload mix and level of performance.
    • Be expressed using commonly accepted logical data model conventions with associated metadata.
    • Manage data, business rules, sources, ownership and so on for data governed by the MDM program using flexible, dynamic and business-consumable metadata management capabilities.
  • Information quality management/semantic capabilities — A good data model is of little value unless it contains accurate, up-to-date and semantically consistent data for a customer. The MDM of customer data solution should:
    • Have strong facilities, in batch and real-time modes, for profiling, cleansing, matching, linking, identifying and semantically reconciling customer master data in different data sources to create and maintain a "golden record." These facilities may be provided by the MDM of customer data solution vendor or by offering tight integration with products from specialist data quality partners.
    • Configure business and data rules for comparing, reconciling and enforcing semantics across data sources, matching and linking the data, and managing the merging and unmerging of records with support for full auditability, survivability and data lineage.
    • Ensure that business rules and associated metadata related to data cleansing are sufficiently visible to satisfy compliance requirements.
  • Business services, integration and synchronization capabilities — The MDM of customer data solution needs to provide facilities for loading customer data in a fast, efficient and accurate manner. There will also be a need for integration middleware, including publish and subscribe mechanisms, to provide a communication backbone for the bidirectional flow of customer data between the central repository and the spoke systems, be they copies or subsets of the repository, or remote applications (coexistence style). Many organizations will also plan to use the new customer master database as the basis for new operational (both transaction and workflow-oriented) and analytical applications. In the service-oriented architecture (SOA) world of enterprise architecture, service-oriented composite business applications may consume MDM of customer data solution business services through Web services' standard interfaces.
    These facilities may be provided by the MDM of customer data solution vendor or through tight integration with products from specialist middleware partners. The MDM of customer data solution should support, as necessary, the MDM implementation styles, which each use loading, integration and synchronization in different ways, by being able to:
    • Leverage a range of middleware products to connect to data sources, including legacy data sources, and expose industry-standard interfaces.
    • Support integration with different latency characteristics and styles — for example, real time and batch.
    • Support integration with downstream business intelligence and analytical requirements.
    • Provide flexible and comprehensive business-services-based capabilities to model data services, as well as user interactions, across applications and data stores where master data is stored and used.
  • Business process management (BPM) and workflow design and management capabilities — Customer master data will permeate a range of business applications across systems and geographies. Successful MDM programs require a strong, business-outcome-driven process understanding of where and when master data is required in order to ensure the integrity of business processes. MDM of customer data solutions do not need to include BPMS technology, but they do need to interoperate with third-party BPMS solutions in order for their stewardship (enforcement) and integration (services) capabilities to be consumed in actual business process orchestrations. A suggested range of necessary capabilities includes ones to:
    • Model or recognize a business process model at a conceptual, logical and physical level, in order to identify a conceptual, logical and physical data model that supports it.
    • Document and understand — that is, diagnose — the flow of master data across business systems, applications and processes.
    • Design, orchestrate and manage a business-level and data-level workflow between any MDM hub and business systems that subscribe to the necessary information infrastructure.
    • Support analytics, key performance indicators and benchmarking for an "as is" version of business processes and their outcomes, as well as workflows within them; also, to support a "to be" version for business process and data models.
  • Performance, scalability, availability and security capabilities — If the MDM of customer data solution supports operational and analytical applications and is tightly integrated with established systems and new applications, serious demands are likely to be made on its performance, scalability and availability. The MDM of customer data solution should have:
    • Proof points, preferably through live references, of different aspects of performance and scalability that match your current and future requirements.
    • Appropriate availability characteristics regarding planned and unplanned downtime.
    • On the security and data privacy management front, the ability to:
    • Manage the policies and rules associated with potentially complex privacy access rights.
    • Configure and manage different rules of visibility, providing different views for different roles.
  • Stewardship support and services — The MDM of customer data solution needs to support a range of capabilities, from information policy evaluation through to the day-to-day operation and management of MDM. Governance roles focus on policy setting; steward roles focus on policy enforcement. Both the (business-led) data steward and governance roles require a suitable UI through which these services are provided. These services will include, but not be limited to:
    • Design and impact assessment of information policy pertaining to business or systemwide authority for data.
    • Analytics and performance measures related to a range of processes and activities taking place within MDM, from running batch data loads to executing workflows against benchmarks, assessing the quality of active master data, running business process benchmarks, and measuring the business value provided by MDM.
    • Status and management tools for the steward and governance roles to monitor to-do lists of users to ensure effective action takes place across the MDM landscape.
    • Systemwide master/meta models to help identify which users, roles, applications and systems are responsible for which master data, and the state of the master data and/or business rules that are generating exceptions in that data.
    • Workflow services to interrogate and provide revisions to current MDM workflows.
    • Business rules services to interrogate which rules are used by MDM and provide suggested enhancements to such business rules; these are also used to determine under which circumstances source preference is revised to give preference to the most dependable source.
    • Full, business-consumable audit trail information to identify past changes to information.
    • A range of user interfaces on PCs, smartphones and tablets.
  • Technology and architecture considerations — MDM of customer data solutions should be based on up-to-date, mainstream server, PC and mobile device technologies, and be capable of flexible and effective integration with a wide range of other application and infrastructure platform components — whether from the same vendor or not — within end-user organizations.
    An MDM of customer data solution should be capable of:
    • Flexible configuration into a range of implementation styles in terms of instantiation, latency and use of customer master data to enable it to satisfy different use case scenarios, such as the consolidation, registry, coexistence and centralized scenarios.
    • Architecturally supporting global rollouts and localized international installations.
    • Supporting both on-premises and cloud deployment styles, including SaaS.
    • Supporting integration with big data sources, such as social networks, and performing entity resolution within those sources, whether relational or nonrelational, and whether data is structured or unstructured.
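The source-preference behavior described in the business rules bullet above (revising preference toward the most dependable source) can be sketched as attribute-level survivorship. The following is a minimal, hypothetical illustration; the trust scores, source names and record layout are invented and do not reflect any vendor's product:

```python
# Minimal sketch of attribute-level survivorship by source preference.
# Trust scores and record layout are hypothetical, for illustration only.

# Per-source trust scores: higher means a more dependable source.
SOURCE_TRUST = {"crm": 3, "erp": 2, "web_signup": 1}

def build_golden_record(records):
    """Merge candidate records for one customer, field by field,
    keeping each non-empty value from the most trusted source."""
    golden = {}
    for field in {f for r in records for f in r if f != "source"}:
        candidates = [r for r in records if r.get(field)]
        if candidates:
            best = max(candidates, key=lambda r: SOURCE_TRUST[r["source"]])
            golden[field] = best[field]
    return golden

records = [
    {"source": "web_signup", "name": "J. Smith", "email": "js@example.com"},
    {"source": "crm", "name": "Jane Smith", "email": ""},
    {"source": "erp", "name": "Jane A. Smith", "phone": "555-0100"},
]
print(build_golden_record(records))
# The name survives from CRM, the email from the web sign-up,
# and the phone from ERP.
```

In a real solution these rules would be configurable per attribute and auditable, which is what the audit trail and business rules services in the list above provide.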

Evaluation Criteria

Ability to Execute

Gartner analysts evaluate technology providers on the quality and efficacy of the processes, systems, methods or procedures that enable IT providers' performance to be competitive, efficient and effective, and to positively impact revenue, retention and reputation. Ultimately, technology providers are judged on their ability and success in capitalizing on their vision.
Product or Service: Software products offered by the vendor that compete in/serve the MDM of customer data solutions market segment. This includes product capabilities, quality, feature sets, skills and so on, whether offered natively or through OEM agreements and partnerships as defined in the market definition and detailed in the subcriteria.
Vendors are measured on the ability of their products to support the following MDM of customer data solution subcriteria:
  • Data modeling capabilities
  • Information quality and semantic capabilities
  • Business services, integration and synchronization
  • Workflow and BPM capabilities
  • Performance, scalability, security and availability capabilities
  • Stewardship support and services
  • Technology and architectural considerations
Overall Viability: Viability includes an assessment of the MDM of customer data solution vendor's financial health, the financial and practical success of the business unit or organization in generating business results in the MDM of customer data solutions market segment on a global basis, and the likelihood that the organization or individual business unit will continue to invest in development of the product, offer the product and advance the state of the art within the organization's product portfolio.
Sales Execution: A vendor's capabilities in all MDM of customer data solutions-related presales activities on a global basis, and the structure that supports them. This includes deal management, pricing and negotiation, presales support and the overall effectiveness of the sales channel.
Market Responsiveness and Track Record: Ability to respond, change direction, be flexible and achieve competitive success as opportunities develop, competitors act, customers' needs evolve and market dynamics change within the MDM of customer data solutions market segment. This criterion also considers the vendor's history of responsiveness.
Marketing Execution: The clarity, quality, creativity and efficacy of programs designed to deliver the vendor's message on a global basis, in order to influence the MDM of customer data solutions market segment, promote the vendor's brand and business, increase awareness of its products, and establish a positive identification with its product/brand and organization in the minds of buyers. This "mind share" can be driven by a combination of publicity, promotional, thought leadership, word-of-mouth and sales activities.
Customer Experience: Relationships, products and services/programs that enable clients to be successful on a global basis with the products evaluated. This includes implementation and support and the way customers receive technical and account support. It also includes measures of clients' success in implementing MDM for customer data solutions: customer references and TCO.
With the increasing hype about multidomain MDM, we also look for demonstrated proof — via proof of concepts, customer evaluations and live implementations — of multidomain/multiprovince capability.
Operations: The provider's ability to meet its goals and commitments. Factors include the quality of the organizational structure, including skills, experiences, programs, systems and other vehicles that enable the organization to operate effectively and efficiently on an ongoing basis. This criterion was not explicitly rated, but was rolled into the Overall Viability, Sales Execution/Pricing and Marketing Execution criteria.
Table 1. Ability to Execute Evaluation Criteria

Criteria                        Weight
------------------------------  ------
Product or Service              High
Overall Viability               High
Sales Execution/Pricing         High
Market Responsiveness/Record    High
Marketing Execution             High
Customer Experience             High
Operations                      Low

Source: Gartner (October 2013)

Completeness of Vision

Gartner analysts evaluate technology providers on their ability to convincingly articulate logical statements about current and future market direction, innovation, customer needs and competitive forces, and how well they map to Gartner's position. Ultimately, technology providers are rated on their understanding of how market forces can be exploited to create opportunity for the provider.
Market Understanding: A vendor's ability to understand buyers' needs and translate these needs into products and services. Vendors that show the highest degree of vision listen and understand buyers' wants and needs, and can shape or enhance those wants with their added vision. Vendors should demonstrate a strategic understanding of MDM for customer data solution opportunities (for example, new application functionality or customer segments) and ongoing vendor market dynamics (for example, consolidation trends) on a global basis, and translate that understanding into products and services. Additionally, we consider a vendor's understanding of the wider implications of, and position of MDM in relation to, other kinds of master data within an organization's multidomain, multiuse case and multi-implementation style program; an understanding of the relationship to enterprise information architecture and EIM initiatives is also valuable for customers taking a strategic view.
Marketing Strategy: A clear, differentiated set of MDM of customer data solution messages consistently communicated throughout the organization and externalized globally through a website, advertising, customer programs and positioning statements. Intersection with multidomain MDM and wider MDM and industry challenges, as expressed by Gartner clients, is important.
Sales Strategy: A vendor's strategy for selling an MDM of customer data solution that uses its, or a partner's, global network of direct and indirect sales, marketing, service and communication affiliates to extend the scope and depth of its market reach, skills, expertise, technologies, services and customer base.
Offering (Product) Strategy: A vendor's approach to product development and delivery, which should emphasize differentiation, functionality, methodology and feature set as they map to current and future requirements. A vendor's published "statement of direction" (or Gartner's understanding thereof) for the next two product releases needs to keep pace with or surpass Gartner's vision for the MDM of customer data solution market segment. Gartner's main product-oriented criteria focus on:
  • Data modeling capabilities
  • Information quality and semantic capabilities
  • Business services, integration and synchronization
  • Workflow and BPM capabilities
  • Performance, scalability, security and availability capabilities
  • Stewardship support and services
  • Technology and architectural considerations
Each vendor needs to offer an MDM of customer data solution that can be configured into a range of architectural styles, in terms of instantiation, latency, search and usage of customer master data, to enable it to satisfy different use case scenarios, such as the consolidation, registry and centralized style scenarios, and leading to hybrid models such as the coexistence style.
Each vendor needs to show how its MDM of customer data solution supports a wide range of use cases, from business design (construction-centric MDM) to business operations (operational MDM) and business intelligence (analytical MDM). Most vendors focus on one use case, so they need to demonstrate how they intend to support the growing convergence in requirements across use cases.
Each vendor must also understand major technological and architectural shifts in the market, and communicate a plan to leverage them, including migration issues that may affect customers on current releases. Specifically, the vendor should have a vision to support mainstream software infrastructure technology, as opposed to a proprietary stack, and have an evolutionary path toward SOA.
Business Model: The soundness and logic of an MDM of customer data solution vendor's underlying business proposition. Vendors should have a well-articulated strategy for revenue growth and sustained profitability. Key elements of strategy include the sales and distribution plan, internal investment priority and timing, and partner alliances, such as with external service providers.
Vertical/Industry Strategy: A vendor's strategy to direct resources, skills and offerings to meet the specific needs of individual market segments, including industries. Included are reviews of the vendor's strategy for meeting the needs of specific industries, such as banking, manufacturing, communications and government.
Innovation: Vendors need to be able to lead this market and, in so doing, provide customers with an innovative solution and approach to meet customers' needs in a complex, heterogeneous environment. Innovation implies leading the way with MDM of customer data issues both today and in the future. We looked for understanding of, and support for, the most complex and broadest MDM of customer data environments and the growing requirements of multidomain and multi-use-case MDM in general. New this year is a focus on how vendors plan to support key initiatives such as the cloud, social data and other kinds of big data, and mobile communications in the context of MDM.
Geographic Strategy: A vendor's strategy to direct resources, skills and offerings to meet the specific needs of geographies outside its native geography, either directly or through partners, channels and subsidiaries, as appropriate for that geography and market. This includes sales, marketing and support for complex global companies.
Table 2. Completeness of Vision Evaluation Criteria

Evaluation Criteria             Weighting
------------------------------  ---------
Market Understanding            High
Marketing Strategy              High
Sales Strategy                  Medium
Offering (Product) Strategy     High
Business Model                  Medium
Vertical/Industry Strategy      High
Innovation                      High
Geographic Strategy             Medium

Source: Gartner (October 2013)

Quadrant Descriptions

Leaders

Vendors in the Leaders quadrant have strong results and strong delivery capabilities, and will continue to have them. They typically possess a large, satisfied customer base (relative to the size of the market) and enjoy high visibility in the market. Their size and financial strength enable them to remain viable in a challenging economy. Leaders have mature offerings and track records of successful deployments, even in the most challenging environments, across all geographies and in many industries. Leaders have the strategic vision to address evolving client requirements; however, they are not always the best choice.

Challengers

Challengers demonstrate a clear understanding of today's MDM of customer data solutions market segment, but they have either not demonstrated a clear understanding of the market's direction or are not well-positioned to capitalize on emerging trends. They often have a strong market presence in other application areas.
There are no Challengers in 2013's Magic Quadrant. The MDM of customer data solutions market segment is increasingly being shaped by the gradual formation of requirements centered on multidomain MDM — in other words, single solutions that can be used for any number of data domains. This influence was very slight five years ago, but it has grown every year, and as a result the positions of vendors in this year's Magic Quadrant have been "elongated" from lower left to upper right in Figure 1. If a distinct multidomain MDM market emerges, the MDM of customer data solutions market segment may no longer need to meet those multidomain requirements; future editions of this Magic Quadrant may then focus more narrowly on the single domain, in which case the positions of the vendors in Figure 1 are likely to spread out. The current level of ancillary interest in multidomain capabilities among users of MDM of customer data solutions can be seen in this year's Magic Quadrant reference survey: 43% of respondents voiced interest in noncustomer domains, but only 29% formally evaluated those capabilities before purchase.

Visionaries

Visionaries display healthy innovation and a strong potential to influence the direction of the MDM of customer data solutions market segment, but they are limited in execution or demonstrated track records. Typically, their products and market presence are not yet complete or established enough to merit Leader status. There are no Visionaries in this year's Magic Quadrant.

Niche Players

Niche Players do well in specific segments of the MDM of customer data solutions market segment, or have a limited ability to be innovative or outperform other vendors in this segment. They may be focused on a specific functionality, domain or industry, or have gaps in relation to broader functionality requirements. Niche Players may have limited implementation and support services, or they may not have achieved the scale necessary to solidify their market positions.

Context

This Magic Quadrant offers insight into the part of the packaged MDM solution market that focuses on how organizations master and share a "single version" of customer data with multiple views of it across their organizations — achieving a single version of master data is a key initiative for many organizations. In this Magic Quadrant "customer data" is defined as including consumers, business customers, channel/trading partners, prospective customers, citizens, constituents, people of interest, healthcare professionals, patients and counterparties; it excludes other parties, such as human resources and suppliers. This analysis positions MDM of customer data solution vendors (and their offerings) on the basis of their Completeness of Vision relative to the market segment, and their Ability to Execute on that vision.
Use this Magic Quadrant to understand the MDM of customer data solutions market segment, and how Gartner rates the vendors (and their offerings) in this segment. Study this research to evaluate vendors by a set of objective criteria that you can adapt to your particular situation. Gartner advises organizations against simply selecting vendors in the Leaders quadrant. All selections should be buyer-specific, so vendors from the Challengers, Niche Players and Visionaries quadrants might be better matches for your requirements. See "How Gartner Evaluates Vendors and Markets in Magic Quadrants and MarketScopes."
Although important, selecting an MDM for customer data solution is only part of the MDM challenge. To succeed, you should put together a balanced MDM program that creates a shared vision and strategy, addresses governance and organizational issues, uses appropriate technology and architecture, and creates the necessary processes and metrics for your customer data system (see "The Seven Building Blocks of MDM: A Framework for Success" and "The Five Vectors of Complexity That Define Your MDM Strategy").

Market Overview

The Need for a Single View of the Customer

Business drivers for creating a single view of the customer include:
  • Compliance and risk management drivers, such as "know your customer," anti-money-laundering and counterparty risk management in the banking sector, and Sunshine Act compliance in the life sciences sector. Associated initiatives tend to have concrete benefits and they are mandatory.
  • Cost optimization and efficiency drivers. Very often these drivers are associated with business transformation initiatives and end-to-end business process improvement programs. These have tangible benefits and are a good fit for organizations' needs during an economic downturn.
  • Revenue and profitability growth drivers. Examples are initiatives to improve cross-selling, upselling and retention. CEOs, chief marketing officers and CIOs are placing increased emphasis on improving the customer experience through an accurate and complete understanding of customers' interactions with their enterprises. These drivers can be more difficult to measure, but are a major focus when the economy is going well.
However, most large enterprises have heterogeneous application and information management portfolios, with fragments of often inaccurate, incomplete and inconsistent data residing in various application silos. No single system contains this single view of the customer or is designed to manage the complete life cycle of customer master data.
The ability to create, maintain and draw on a single, trusted, shareable version of customer master data is increasingly seen as an essential requirement in commercial and noncommercial organizations to support business processes and business decision making. When creating and managing customer master data, many organizations and vendors originally thought that CRM, ERP or industry application systems would solve the problem of inconsistent master data spread across multiple systems; however, CRM, ERP and industry systems were not designed for that task, and often there are multiple CRM or ERP systems in an enterprise. Many organizations have now invested in creating a new central system to master their customer data, with the majority (an estimated 80%) of organizations buying packaged MDM of customer data solutions, as opposed to building the capability themselves.
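The matching step behind such a single customer view can be sketched as grouping records from different application silos that appear to describe the same party. This is a minimal illustration assuming exact normalized keys; real MDM hubs use probabilistic and fuzzy matching over many attributes, and all systems and records here are invented:

```python
# Minimal sketch of cross-silo customer matching by a normalized key.
# Real MDM hubs use probabilistic/fuzzy matching; exact keys are used
# here only to keep the illustration short. All records are invented.
from collections import defaultdict

def match_key(record):
    """Derive a simple blocking key: lowercased email, falling back to
    a normalized name. Production matching weighs many attributes."""
    email = record.get("email", "").strip().lower()
    if email:
        return email
    return " ".join(record.get("name", "").lower().split())

def cluster(records):
    """Group records from different application silos that appear to
    describe the same customer."""
    groups = defaultdict(list)
    for r in records:
        groups[match_key(r)].append(r)
    return list(groups.values())

silo_records = [
    {"system": "crm", "name": "Jane Smith",  "email": "JS@Example.com "},
    {"system": "erp", "name": "SMITH, JANE", "email": "js@example.com"},
    {"system": "web", "name": "Bob Jones",   "email": "bob@example.com"},
]
clusters = cluster(silo_records)
print(len(clusters))  # 2: Jane's two records match, Bob's stands alone
```

Clusters like these are what a packaged solution then feeds into survivorship rules to produce the trusted, shareable golden record.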
Organizations in different industries have different business models, and therefore their MDM efforts vary (see "The Five Vectors of Complexity That Define Your MDM Strategy"). Some organizations have a customer base of millions of consumers, such as high-volume B2C organizations. Others have a base of thousands or tens of thousands of customers, such as lower-volume, but more complex, B2B organizations. This has implications for the MDM implementation style (see "The Important Characteristics of the MDM Implementation Style").
In a high-volume B2C organization, customer data is typically authored in a distributed fashion in existing applications. In this case, the MDM "journey" may start with either registry-style indexing in the central hub or a physical consolidation into the central hub, potentially followed by publishing from the hub to harmonize the different application systems in a coexistence style. Some organizations reach their intended goal by coupling hub-and-spoke systems more tightly with transactional access to the hub where central authoring takes place. The B2B requirement also often leads to central authoring, but on the basis of a collaborative workflow.
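The registry and consolidation styles mentioned above differ chiefly in what the hub stores: an index of pointers into the source systems versus a reconciled physical copy. The following hypothetical sketch contrasts the two data shapes; the systems, identifiers and attributes are invented:

```python
# Contrast of two MDM hub styles, expressed as data shapes.
# A registry hub stores only a cross-reference index into source systems;
# a consolidated hub stores a reconciled physical copy of the master
# record. Systems, IDs and attributes here are hypothetical.

# Registry style: the hub holds pointers, and reads fan out to sources.
registry_entry = {
    "master_id": "CUST-0001",
    "xref": [("crm", "C-9913"), ("erp", "800042"), ("billing", "B-77")],
}

# Consolidated style: the hub holds the reconciled golden record itself.
consolidated_entry = {
    "master_id": "CUST-0001",
    "name": "Jane Smith",
    "email": "js@example.com",
    "sources": ["crm", "erp", "billing"],
}

def resolve(entry, fetch):
    """Assemble a customer view. Registry entries require callbacks into
    the source systems; consolidated entries are served from the hub."""
    if "xref" in entry:
        view = {}
        for system, local_id in entry["xref"]:
            view.update(fetch(system, local_id))
        return view
    return {k: v for k, v in entry.items() if k not in ("master_id", "sources")}

print(resolve(consolidated_entry, fetch=None))
```

The coexistence style sits between the two: the hub keeps a physical copy, but authorship is shared with the source systems and changes are synchronized in both directions.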
Our Magic Quadrant reference survey (see Note 2) found that 29% of respondents followed the centralized approach in 2013, up from 20% in 2012. The coexistence style, where the authority model is shared between the MDM hub and its source operational systems, was adopted by 13% of respondents, up from 12% in 2012. The consolidated style of MDM hub was reported by 40% of the respondents in 2013, up from 36% in 2012. Their customer master data store contained a reconciled copy of the master data from other authoritative sources. The percentage of those following the registry approach — with their customer master data store consisting only of an index to the master data in other authoritative sources — fell to 4%, from 9% in 2012. Hybrid approaches were reported by 13% of respondents, down from 21% in 2012.

The Market Is Maturing Steadily but Still Has Some Way to Go

Momentum has been steadily building in this market during the past 10 years, during which time MDM vendors have sold over 2,500 copies of their MDM of customer data solutions. Moreover, MDM vendors that traditionally sold lead products in other disciplines (such as data quality or data integration) are now widely reporting sales driven primarily by MDM, with "pull-through" of other products in the same deals. In addition, the continued strong growth of the CRM market (see "Forecast: Enterprise Software Markets, Worldwide, 2012-2017, 2Q13 Update") bodes well for the future of MDM of customer data, as MDM commonly lags CRM implementations by a few years — weakly managed CRM data quality can result in operational difficulties (such as salespeople redundantly calling the same prospective customer) that ultimately require an MDM implementation to solve.
However, our survey for this Magic Quadrant found that the proportion of organizations that described their use of enterprisewide MDM of customer data as "well established" was down by 5% from 2012; the proportion that said they were "working toward" enterprisewide MDM of customer data held steady; and the proportion that described themselves as "having good MDM capabilities in some areas" grew by 9%.
Despite its momentum, the market is still characterized by considerable immaturity. In interactions with Gartner's client inquiry service and in one-on-one meetings at Gartner events, approximately 40% of organizations have said they are just starting their MDM programs. Additionally, vendors of MDM of customer data solutions are still expanding their products in different directions, and new players continue to enter the market.
In "Hype Cycle for Enterprise Information Management, 2013," we place MDM of customer data in the Trough of Disillusionment. This is actually a positive sign, as it shows that MDM of customer data is well past the initial hype and early adopter implementations, and is steadily gaining maturity, although it is not yet fully mature.

Vendors Are Investing in Data Stewardship and Governance Technology

In terms of new MDM capabilities, vendors have been placing particular emphasis on adding or improving data stewardship and governance facilities, including data profiling, workflow, data visualization and manipulation, dashboards and reporting. In 2012, they introduced better UIs and workflows for business users, making greater use of business process management technology and MDM applets, which allow existing applications to use MDM-hub-based data.
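The data profiling facilities mentioned above reduce to quality metrics that a steward's dashboard can display. A minimal sketch, assuming simple completeness and duplicate measures over invented customer records (field names and data are hypothetical):

```python
# Minimal sketch of the kind of data profiling a stewardship UI surfaces:
# per-attribute completeness plus duplicate counts over customer records.
# Field names and records are hypothetical.
from collections import Counter

def profile(records, fields):
    """Return the fill rate per field and the count of duplicated emails:
    the raw numbers behind a steward's data quality dashboard."""
    n = len(records)
    fill = {f: sum(1 for r in records if r.get(f)) / n for f in fields}
    emails = Counter(r["email"].lower() for r in records if r.get("email"))
    dupes = sum(c - 1 for c in emails.values() if c > 1)
    return {"fill_rate": fill, "duplicate_emails": dupes}

records = [
    {"name": "Jane Smith", "email": "js@example.com", "phone": "555-0100"},
    {"name": "J. Smith",   "email": "JS@example.com"},
    {"name": "Bob Jones",  "email": "bob@example.com", "phone": ""},
]
report = profile(records, ["name", "email", "phone"])
print(report["duplicate_emails"])  # 1
```

Dashboards, workflow queues and to-do lists then turn metrics like these into actionable work items for data stewards.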
Across every aspect of MDM — product, customer and multidomain — stewardship tools are turning into solutions called information stewardship applications. This is an exciting trend as it shows the relevance of master data to business users in terms of business value and impact (see "The Emergence of Information Stewardship Applications for Master Data"). Organizations are applying stewardship applications more often to business data attributes beyond those in master data domains, to support data stewards in their data governance activities across varied data management initiatives. Although demand is still emerging, it is clear that early-adopter organizations recognize the value of these capabilities.

The Nexus of Forces Creates Both Opportunities and Risks

Gartner calls the growing convergence of cloud, social networking, mobility and information trends the "Nexus of Forces" (see "The Nexus of Forces: Social, Mobile, Cloud and Information"). Organizations in the customer data MDM market are keeping a close eye on opportunities in these areas. Most organizations recognize there will be both a cultural and a technological shift, but are struggling to understand the impact this will have on their information environments and the required strategic responses (see "The Impact of Social and Other 'Big Data' on Master Data Management").
Organizations recognize that MDM is critical for accurate sentiment analysis in social networks, but it may take longer to implement than they are willing to wait. They also understand that it is more damaging to send unsuitable or redundant retail offers to a customer's smartphone than it is to send the same offers via postal mail — here, again, the role of MDM comes to the fore.
As expected, organizations have shown a high level of interest in social media and mobility as they relate to MDM of customer data. But they have shown a low level of interest in the cloud in connection with MDM in general. There are several reasons for this (see "Hype Cycle for Information Infrastructure, 2013"). Some organizations are using cloud MDM hub services in limited cases, such as rapid proofs of concept; however, the overwhelming preference is currently for on-premises implementations.
Some vendors recognize the need to move from a "single view of customer truth" to a "single view of customer trust" and are establishing relevant product strategies to reflect this. But user organizations typically do not know how to react. As such, we believe there is a high risk that some user organizations will follow a tactical, technological (vendor-led) response to the impact of the Nexus of Forces, rather than a strategic one that would be of greater benefit to their business.

Market Growth Continues and Several Portfolios Remain Complex

Gartner estimates that total software revenue for packaged MDM solutions was $1.6 billion in 2012, an increase of 7.8% from 2011, as compared with a 4.7% rise for the overall enterprise software market (see "Forecast: Enterprise Software Markets, Worldwide, 2012-2017, 2Q13 Update"). Within these overall figures we estimate that the MDM of customer data solutions market segment was worth $527 million in 2012, an increase of 5.4% from 2011. In "Forecast: Master Data Management, Worldwide, 2010-2015," we projected a five-year compound annual growth rate of nearly 20% for both the overall MDM software market and the MDM of customer data software market segment through 2015.
We estimate that IBM is the market share leader in the MDM of customer data solutions market segment (based on sales of InfoSphere MDM SE and AE), with estimated total software revenue of $203 million in 2012. Oracle is in second place (based on sales of its Oracle CDH, Oracle Fusion Customer Hub and Siebel UCM products) with estimated revenue of $114 million in 2012. Informatica is in third place with estimated revenue of $70 million in 2012. SAP is in fourth place with estimated revenue of $30 million (based on sales of NetWeaver MDM and stand-alone hub deployments of MDG-C) in 2012. Tibco is in fifth place with estimated revenue of $16 million in 2012. Together, we estimate that these five market share leaders account for over 80% of the MDM of customer data solutions market segment.
Unlike earlier years, the past year has not been characterized by acquisitions, except for Informatica's acquisition of Data Scout (and subsequently, product information management vendor Heiler and BPMS vendor Active Endpoints). But we continue to see the after-effects of acquisitions: the larger vendors continue to promote and execute convergence road maps to integrate formerly disparate product and technology mixes.
Investment in MDM of customer data solutions continues to occur across all industries, including the government sector. Service industries (such as financial services and healthcare) and governments tend to focus on the customer data domain (except for some sectors of financial services that deal heavily with securities), whereas product-oriented industries tend to be interested in a wide set of data domains (such as product, supplier and customer). There is global interest and investment in MDM of customer data solutions, mainly by large enterprises.
The MDM portfolios of the megavendors (IBM, Oracle and SAP) remain complex. This is largely a result of them trying to meet the initial multidomain demands of the MDM market. IBM continues to focus strongly on a convergence road map for its multiple products, and has begun substantial development relating to its vision for MDM linked with big data sources such as social networks.
Oracle is also converging onto common middleware and MDM technology infrastructure, though sales execution and consequently market uptake of its Fusion MDM platform have been slow.
SAP now has two products in this market segment — NetWeaver MDM and MDG-C — with a third, MDG Enterprise Edition, planned for 2014 for consolidation-style MDM, replacing the planned development of the Master Data Services (MDS) product. The capabilities originally planned for stand-alone MDS are being integrated with MDG capabilities and will be delivered as part of MDG Enterprise Edition.
The smaller vendors have continued to make progress in diverse ways:
  • Informatica has completed its acquisition of Data Scout for MDM of customer data stored in salesforce.com, and has produced versions of this software (now known as Informatica Cloud MDM) incorporating functionality from its on-premises MDM product. Additionally, in 2013, Informatica acquired Heiler Software, a vendor of MDM of product data solutions, to strengthen its multidomain credentials.
  • Tibco Software continues steadily to increase its emphasis on MDM and is becoming more of a force in the MDM of customer data market segment.
  • SAS is taking concrete steps to integrate the former DataFlux MDM solution with its broader suite of SAS data management products.
  • VisionWare continues to provide a distinct Microsoft-based value proposition but remains focused on a limited set of industries, such as healthcare and the public sector.
  • Orchestra Networks, which appeared in the Magic Quadrant for MDM of customer data solutions for the first time in 2012, is growing strongly in some targeted scenarios.
  • Talend appears in this Magic Quadrant for the first time this year, after fully meeting all the inclusion criteria for revenue and implementation references.
Other vendors, such as Ataccama, Information Builders, Kalido, Software AG and Teradata, are also active in this market segment, but their presence (though increasing in some cases) is not large enough in one or more respects for them to be included in this Magic Quadrant. Microsoft has not yet had a major impact on the MDM market with its SQL Server Master Data Services (MDS) technology, other than supporting end users' plans to "build" their own MDM solutions or being incorporated into third-party channel partners' solutions (for example, those of Profisee). While Microsoft's MDS toolset provides several capabilities expected of vendors in this Magic Quadrant, it does not provide the degree of out-of-the-box integration between those capabilities that is typical of an MDM software solution.
Vendors that previously focused on managing product data, such as Riversand and Stibo Systems, started to adopt a more multidomain position in 2012 and to become more relevant to the MDM of customer data solutions market segment. Several of these vendors have developed an MDM of customer data implementation at one or more of their existing clients, capitalizing on established relationships. However, the revenue attributable to these efforts has yet to meet the inclusion criteria for this Magic Quadrant.
There are many other vendors, some small, innovating in and around the field of MDM of customer data. Semarchy is a small French vendor focused on helping clients with an "evolutionary" approach to scaling MDM. Collibra is another small vendor, one that focuses more on the information stewardship side of MDM. Pitney Bowes has built an MDM solution based on the graph database paradigm and emphasizing geospatial capabilities. Dell Boomi has introduced a solely cloud-based platform. These and other vendors show that this market segment is vibrant and constantly evolving. We expect to see more acquisitions and new entrants in the next few years.
Although the overall view of this year's Magic Quadrant appears similar to that of a mature software market, the complex nature of the MDM discipline has led to a situation in which there are still vendors entering the Magic Quadrant for the first time, and others potentially approaching entry. In addition, many vendors are now branching out to manage additional master data domains. Additionally, existing clients and prospective customers have become more educated about the depth and complexity of the expertise and management required by successful MDM implementations, and would seem more likely to rely on a market leader. However, there has also been a noticeable increase in interest during Gartner's interactions with clients (particularly at our MDM conferences) in vendors that can provide specific capabilities at more modest prices. As the overall market continues to grow, there are likely to be sufficient revenue opportunities to keep the Niche Players viable, but they will continue to face a challenge to enter the Leaders quadrant (or even the Challengers quadrant) due to the mind share of the current Leaders.
The current absence of Challengers and Visionaries from this market segment is the result of multiple influences. The Leaders are successfully investing both in solidifying their offerings and in creating synergies with ancillary markets, such as social network data. This has caused several Niche Players either to define very targeted visions for their products in an as-yet-unfulfilled attempt to overtake the Leaders in those areas (thereby inhibiting their Completeness of Vision), or to imitate the investment behavior of the Leaders at the expense of their Ability to Execute. As these strategies unfold, and as the requirements for multidomain capabilities mature (as described in the Challengers quadrant definition), we expect the current vendors to spread into the Challengers and Visionaries quadrants.