Saturday, August 28, 2004

The Decline and Fall of the Wintel Empire

Viewed objectively, the situation of Microsoft and Intel is bad, and it resembles the decline of the Roman Empire. Despite this, the management of both companies remains blissfully unaware of the fact, enjoying the current prospect of success and an unclear future. This is the insidious, paralyzing disease brought on by a sense of one's own privilege. Edward Gibbon's "The Decline and Fall of the Roman Empire" already describes this road to decline: the empire's rulers shifted their priorities from the principles of governance (maintaining effective rule) to the pursuit of personal gain. The drive to seize ever larger territories outstripped their ability to control the new lands (with their diversity of tribes and geographic conditions). Emperor Augustus already saw this problem of expansion, but his warnings were brushed aside.
Now pay attention, because the case of MS is similar. Despite the harbingers of trouble looming on the horizon (Linux as a server and/or desktop, the arrival of OpenOffice, the open source offensive), Microsoft is throwing itself madly into attacking new fronts: a new operating system, Longhorn (with the new, pioneering technologies it contains), a new 64-bit version of Windows 2003 Server, a new database (MS SQL Server 2005), new development tools (Visual Studio 2005), a new high-performance computing initiative (High Performance Server), and initiatives in console gaming (Xbox), mobile phones, and home entertainment centers.
Meanwhile, new kingdoms are rising: Linux and Google.
Longhorn can be compared to Windows 95: its launch has been postponed again and again, but its arrival on the market may mean a revolution.
For 18 months MS has been running an international initiative aimed at curbing the expansion of Linux in developing countries, where more than 600 MS representatives help government agencies with IT education, commerce, and the protection of intellectual property (they teach, among other things, how to start local software companies). The effort, potential, and competence of MS and Intel employees are hard to ignore, but the problem of sheer company size remains. One can see the attempts to speed up projects by adding new staff to every one of them; that strategy long ago passed the point of diminishing returns (it brings no benefit and may even do harm).
In Rome, circumstances increasingly ruled rather than the emperor's talent. The death of Theodosius in 395 AD split the state into two parts between his sons Arcadius and Honorius. The latter had neither passion nor talent for ruling: he gave up the emperor's daily duties to feed chickens in the courtyard, and handed power to a trusted official.
So watch carefully for the day Steve Ballmer starts raising chickens.

Delays in Longhorn

Recently (27 Aug 2004) MS announced it was changing the release plans for its new operating system, Longhorn, due out in 2006. Of its three subsystems, WinFS (a new organization of the file system, built on the Yukon database and intended to make information easier to find) will ship only as a beta, while Avalon (a new approach to graphics plus a three-dimensional interface) and Indigo (new network services - Web Services) will ship as full versions for Windows XP and 2003. The trouble keeping to the Longhorn schedule stems not only from the delays caused by preparing SP2 for Windows XP and a general security review of the code base, but also from the decentralization of the MS organization itself. According to experts, staged availability of the new technologies benefits developers, who can test their solutions right away instead of waiting for work on Longhorn to finish.
Rivals are pleased about the delays; for Novell, for instance, they come in handy: its own products, such as iFolder and Simias, get time to enter the market and build a good name - so says Miguel de Icaza, head of new technologies in Novell's Ximian Services division.
The code base for Longhorn will be Windows Server 2003 SP1, due out in early 2005.
That decision, made a few months ago, is sound, since SP1 for WS2K3 runs on both 32- and 64-bit platforms.
Facing such an enormous programming task, and wanting to meet the deadlines or accelerate the work, functionality had to be cut. This confirms that in large software projects, adding new people to the teams does not bring the expected speed-up - exactly what is happening with the Longhorn project.

What's New at HP

It turns out that most of HP's profit comes from the division selling supplies for peripheral devices (paper and ink/toner). To underscore the uniqueness of its technology, HP has launched a broad advertising campaign to build "awareness" of a new brand and has given its products a new trade name; its photo paper for inkjet printers, for example, is now sold as Vivera. There is a huge market to win - prints from digital cameras (on top of the traditional market for ordinary prints). The potential rivals are Canon and Epson.

Friday, August 27, 2004

A free lunch? Oh yes! (but at what price?)

"There is no such thing as a free lunch…" - it seems that old maxim is proving true once again. The subject is free software released under open source licenses. According to analyst Paul Kirby of AMR Research, some ISVs (Independent Software Vendors) now offer certain classes of their once-paid products for free. Most of these, however, are either (a) "light" versions of products that have proven themselves on the market (e.g., incomplete but very fast implementations of XML parsers, or "trivialized" web services), or (b) simply failed products. Examples:

  1. IBM - Cloudscape (a Java "database" that came in the bag with the acquired Informix) was donated to the ASF (Apache),
  2. CA - "liberated" the database dinosaur Ingres and handed it to the FOSS community under a license with a grant-back clause (i.e., reserving the right to use it in commercial products).

The altruism of commercial firms is visible in many cases, such as:

  1. SUN - NetBeans, rebranded as Forte for Java, and the gathering of new ideas for extending the language through the open JCP (Java Community Process) organization,
  2. IBM - Eclipse as the foundation of the WSAD product (WebSphere Studio Application Developer).

There are also "chameleon" firms that first, quite deliberately, grow free software written and tested by "volunteers", until, after many adventures under the title of version 0.9.x.y.z, it finally becomes version 1.0 - which, of course, is paid (because, as the firms argue, they must provide proper support and service). Take MySQL, for example.

Examples of commercial firms giving away free software as "closed source":

  1. BEA - Beehive, a cut-down version of its professional, enterprise-scale Java tool,
  2. CA - Ingres, the full version of an enterprise-class database,
  3. IBM - Cloudscape, an embedded Java database,
  4. Microsoft - MS SQL Server 2005 Express, a cut-down version of MS SQL Server 2005 (a 2 GB limit per database, one processor, but no limit on the number of users).

Of course there are also positive cases: the Apache Software Foundation, PHP, Perl, Python, Mono, Samba, Sendmail, and JBoss.

One thing is certain - when choosing a vendor to deliver a given technology or piece of functionality, price must not be the only criterion.
The AMR Research report accuses some ISVs of experimenting with open source to improve their market image, cut costs, and make their products more competitive - in other words, "business as usual". So before choosing GPL software, we must analyze the real reason it is free, as well as the overall situation in the chosen product sector and market. The factors to consider can be characterized as follows:

  1. something familiar - the best candidate is a small piece of a product line that is already known on the market (do not pick a whole product line, or a product the market does not know),
  2. customer base - it is a good sign when the ISV open-sources a product with many loyal users, because they keep up the pressure for quality and a lower price,
  3. upsell potential - when an ISV gives away a light version of a product for free, hoping the customer will one day buy the fully functional version, it may signal a commitment to supporting and developing the product line for the long term,
  4. goodwill - if the ISV's main motive for making a product free was to demonstrate its "good will", it may have chosen a product it rarely cared about; be careful, nothing good will come of it,
  5. raking the straw - be careful when the ISV is after "free" development support from the open source community. History knows plenty of cases showing the risk when a vendor, having failed to improve a product with money spent on programmers' salaries, turns to volunteers for "help".

Conclusion: open source gives a company many options and lowers the price of the product, but free software does not suddenly turn "a frog into a prince". Cheap software can truly prove itself when we genuinely have needs that only it can satisfy.

Problems and doubts to be clarified

Tasks:
  1. Which driver type does Oracle prefer in a PHP environment?
  2. How do you connect PHP to JDeveloper?
  3. How do you use PowerDesigner to generate PL/SQL code?
  4. What is the name of the company (Swiss, I think) that offers "painless" migration from T-SQL to PL/SQL and back?
  5. What should a complete PHP development workshop (a full development environment) look like?
  6. How do you build a "starter kit" for web applications?
  7. Is "A comparison of the security of Open Source and Proprietary products" (the Forrester + Gartner report, e.g., on Windows development, of August 25) a good title for a talk?
  8. Is it true that Eclipse unifies, in a single environment, the development of applications (projects) written in different languages?
  9. Is Eclipse suitable (does it have plug-ins) for building HTML pages and "seasoning" them with code such as JavaScript, JSP, or PHP?
  10. Eclipse is a real developer "combine harvester"; the ZOPE environment is also hard to fault, except perhaps that it supports only Python (although its own inventions, ZPT and DTML, i.e., "wrapping" plain HTML with extra "tags", are excellent and do not require using Python at all),
  11. Think up and suggest tools that support working with an Oracle database from the client side, e.g., WinSQL, SQuirreL, TOra, or Toad.

What's New in Eclipse?

A few days ago the Actuate company offered to contribute its capabilities, software, and human resources to another project aimed at building reporting tools released as open source. To that end it joined the board of the Eclipse Foundation. The proposal, named the Business Intelligence and Reporting Tools (BIRT) Project, is open for discussion for 30 days. If accepted, the project will join the group of current projects the Foundation is working on. Five projects are open at present:
  1. Eclipse Project,
  2. Eclipse Tools Project,
  3. Eclipse Technology Project,
  4. Eclipse Web Tools Platform Project,
  5. Test and Performance Tools Platform.

All the projects except the first deal with IT infrastructure.
The new project (note: it is another Eclipse SUB-PLATFORM and accepts plug-ins) will focus on four areas:

  1. a report designer in the Eclipse environment,
  2. a report designer in the Web environment,
  3. an engine for running reports, and
  4. an engine for generating charts.

What does Actuate get out of it? Permission to plug its own commercial product into BIRT, and the chance to sell paid, professional support to users under its Formula One program. BIRT will also become part of Actuate iServer, a tool that extends the company's enterprise report product (enterprise-level reporting) with authentication, administration, scalability, and presentation features.


An unanswered question: how does Microsoft's product, Report Server, compare with all this?

Thursday, August 26, 2004

Intelligent Applications: The Pieces Are Moving

What's that grinding sound? It's nothing less than strategic business applications shifting to a higher virtual plane.
By David Stodder

Even as the dog days of summer approach and attention turns to vital concerns like sand-dribble castles and fish bait, groaning and grinding continues to emanate from the movement of the IT industry's tectonic plates. Touting utility computing, Hewlett-Packard, IBM, and other major infrastructure providers vie to become your one-stop shop. The ongoing drama of Oracle's attempt to acquire PeopleSoft has removed the veil over carnivorous machinations latent in a whole range of enterprise application vendors. Finally, as Internet computing matures through service-oriented architecture (SOA) — and as radio frequency identification (RFID) technology approaches mainstream — many user organizations are coming to realize that they must break free of older notions and move up to a new vision of business-IT alignment.

Consolidation Conundrums
Thanks to Oracle's PeopleSoft acquisition caper, we've learned that Oracle was also coveting J.D. Edwards, not to mention Lawson Software. Meanwhile, news came out that SAP and Microsoft had been in extensive merger and acquisition talks, even as Microsoft pumped funding into its current Microsoft Business Solutions portfolio, which the company created through earlier acquisitions. SAP's June earnings report showed impressive numbers, including a 45 percent rise in U.S. sales. Siebel Systems, on the other hand, gave investors a heads-up that it would miss its numbers by a fair amount in the second quarter, despite reporting success with its Analytics product over the past year.

Consolidation plus uncertain business prospects in the enterprise applications sector brought consternation to IBM, which views itself as the prime infrastructure partner to many of the vendors in play. News sources reported on internal IBM memos that expressed concern about the company's competitive position given the blending of infrastructure and applications anticipated by the technology directions of Microsoft, Oracle, and SAP.

In early July, determined to strengthen its "middleware" positioning, IBM announced its intention to acquire Alphablox Software. A regular at business intelligence (BI) and data warehousing trade shows, Alphablox always took a little while to understand; however, its distinctive view of BI as an application development problem made Alphablox one of the more innovative software providers out there. If the acquisition goes well, IBM could use Alphablox to move BI out of its "query and reporting" confines by giving a wider spectrum of developers and ISVs the tools with which to build applications focused on leveraging information resources.

Cost Versus Value
While the courts will have their say, further consolidation among application software vendors is bound to come. Incremental dips and turns in demand could trigger mergers and acquisitions, but the true causes are bigger. With packaged applications and services taking care of the meat and potatoes and reducing need for customization, businesses want to focus on development that will deliver competitive advantage. In the CRM arena, all you have to do is look at the success of Salesforce.com to see why the Siebels and PeopleSofts of the world must move quickly to raise their portfolios above basic offerings and focus on what brings customers flexibility and innovation. That's why SOA is such a hot topic. SOA approaches promise to both preserve current investment and enable the addition of modular components.

Cost remains a key factor — and vendors often tout the potential of SOA to reduce the cost of both application management and development. The proof there will be in the pudding: with luck, a tasty pudding, not an uncongealed morass of conflicting standards and proprietary extensions.

However, cost is also about software pricing and licensing, which are in major flux. Embedded components inside business objects, deployment over the Web, and other distributed platform issues complicate how to calculate the bottom line. Wall Street and government regulators also have a say in how they would like to see software providers recognize new sales versus recurring maintenance revenues. Negotiations also must take into account services: New technology adoption depends on a strong business case, which makes business analysis services incredibly important in influencing new sales and licensing.


To balance the higher cost of value-added services, vendors and their IT customers are reaching into a familiar bag of tricks: automation, but on a much bigger scale. Automation will involve smarts about not only systems and security management, but also about types of queries, anticipated user and data volumes, and process management. While automation might help create the "black box" some organizations wish their IT resources could become, it could also be where you'll find the strongest link between IT and business objectives. Automation will be essential for implementing business rules, process management, and model-driven knowledge discovery to deliver value from increasingly complex, global operations.

Utility Computing
It's ironic that to better align business and IT some organizations will decide upon the ultimate separation of the two: They'll contract with a major vendor to deliver "utility" computing. IBM's On Demand computing, HP's Adaptive Enterprise, and equivalent solutions from Computer Associates, Sun Microsystems, and other providers are competing for contracts with organizations that have decided that they're not in the IT business. HP aims to "automate the dynamic link between business and IT," according to its literature. The vendors' financial services will arrange pay-as-you-go or other forms of leasing and subscription contracts.

HP and its competitors view automation as critical to enabling business flexibility, essentially so that systems can respond more quickly to change, deliver information more rapidly, and establish consistency, quality, and availability. If a business requires a certain response rate from its e-commerce applications running on Oracle, for example, HP's Adaptive Enterprise solutions (including OpenView) will use automated intelligence to let the system figure out how to make it happen.

Virtualization is also an essential concept of utility computing, as it has been for nearly all modern innovations in software, servers, and storage. Down the road, virtualization will lift organizations above the entire IT function. We're not there yet, but in talking with HP, IBM, and others, the idea is that the IT utility will respond directly to business objectives, which may be expressed through models, rules engines, process management systems, portals, and so forth.

HP and IBM talk about "self-healing" autonomic computing — in HP's case, that gives the system the smarts to immediately replace bad CPUs or other components. But the big idea is to establish some logical plane above the integrated network of systems that allows for swapping in and out underlying components. Grid computing, which has practically become synonymous with utility computing, depends on virtualization. HP envisions blades that can power up with your "personality" so that desktop systems needn't be replaced; this notion could apply to larger components on the grid as well.

RFID: Data in Motion
China has announced that it will give its citizens RFID-enabled identification cards that will combine banking and credit information, driver's license, and possibly even health data into one card. TechWeb News reported recently that Mexico's attorney general and 160 of his employees working at the Mexican anti-crime information center have RFID chips implanted in their arms. These news items signal a trend that could eventually find billions of people wearing RFID chips. In other words, not only will RFID implementation generate tremendous data volumes: Whether on palettes or people, RFID data sources will also be moving around, gathering together and coming apart, and variously playing roles in multiple processes.

RFID presents a form-factor challenge to software vendors; they must put the right applications and data on the tags to enable both real-time and sophisticated trend analysis based on the data. The speed, size, and variety of data flowing through systems using RFID will also force user organizations to climb up to a higher level of abstraction, where they can gain the big picture that allows them to understand business networks of processes and derive knowledge from data generated by those processes.

Will RFID prove to be the "tipping point" that pushes organizations — unable to marshal the talent and funds to act alone — toward utility computing? We shall see; however, it's clear we're at the beginning of some big changes. Vendor consolidation, SOA and the Internet, and RFID are three concurrent factors reshaping enterprise computing, forcing organizations to evolve toward a higher state of intelligence — or risk suffering the fate of the Neanderthals.

David Stodder [dstodder@cmp.com] is Editorial Director and Editor-In-Chief of Intelligent Enterprise.

ISS discovers a "hole" in the Mozilla project

On August 24, Internet Security Systems (ISS) issued a warning: there is a serious security hole in a widely used technology from the Mozilla Foundation. It concerns the Netscape Network Security Services (NSS) library used for network services (Web Services), specifically a flawed implementation of the SSL v2 protocol.
The danger affects ALL products that use this technology. The Mozilla Foundation has released an appropriate patch.
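Beyond patching the library itself, a defensive habit on the client side is to refuse the legacy SSLv2/SSLv3 handshakes outright. A minimal sketch using Python's modern `ssl` module (an illustration added here, not part of the NSS patch itself):

```python
import ssl

def make_strict_context() -> ssl.SSLContext:
    """Client-side TLS context that refuses legacy protocols.

    PROTOCOL_TLS_CLIENT already rejects SSLv2/SSLv3 and enables
    certificate and hostname verification by default; here we also
    require TLS 1.2 or newer.
    """
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx

# Usage: make_strict_context().wrap_socket(sock, server_hostname=...)
# would then fail the handshake against any SSLv2-only peer.
```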

Building a platform for managing the Windows environment

Following up on the previous article, here is an example of a Microsoft initiative:

Microsoft releases key management software
By John Fontana Network World Fusion, 08/25/04
Microsoft Wednesday released the latest versions of its performance and monitoring tool, which forms a cornerstone in its plan to create a management platform for Windows.
Microsoft Operations Manager (MOM) 2005 and MOM Workgroup Edition were released to manufacturing and should be generally available Oct. 1.

MOM 2005 is a major part of Microsoft’s Dynamic System Initiative, a comprehensive management platform being developed for Windows that includes upgrades to Visual Studio as well as infrastructure software.
“MOM is a major step for us,” says David Hamilton, director of the Windows and enterprise management division at Microsoft.
Along with System Management Server (SMS) 2003, MOM 2005 will become the foundation for a new product called System Center, which is due to be released next year. It will feature a common interface to link MOM and SMS and a set of reporting services that will draw information from the two management tools.
Last month, Microsoft’s chief software architect Bill Gates called out MOM, SMS and System Center, along with Windows XP SP2, the next version of Windows Server 2003 (called R2), and Internet Security and Acceleration Server, and said those products would help transform security “from a concern for [Microsoft] into something that’s a significant, unique asset as well as a business opportunity.”
Microsoft has more than 50 Management Packs available for use with MOM, including 20 new packs with the 2005 version. Management packs are linked with certain applications, such as Exchange or SQL Server, and communicate knowledge about an application’s functions and errors to MOM.
The new packs track application state, model health and support the ability to correct errors or restart services or entire servers. A new reporting and analysis engine includes hundreds of pre-built reports.

Microsoft added a wizard setup tool, a configuration checker, discovery and automatic installation of Management Packs to help ease installation. Also new are an Outlook-style interface and a role-based console to tailor MOM for administrators, operations staff and those collecting reports.
MOM 2005 ships with the MOM Connector Framework, which includes connectors to other management platforms, including HP OpenView, IBM Tivoli, and CA Unicenter.
Also, 20 third-party partners announced Management Packs, including Siebel, SAP, Cisco and Dell, while others offered add-ons or integration with MOM, including BindView, eXc Software, JalaSOFT, MetiLinx, Motive, NetIQ, NetPro Computing, Opalis Software, Quest Software, Silect Software, Skywire Software, Tidal Software and Veritas.
MOM 2005 Workgroup Edition offers functionality similar to that of MOM 2005, but is limited to use on 10 devices.
MOM 2005 is priced at $729. Users are required to have an Operations Management License (OML) for each device managed by MOM. An OML five-pack is priced at $2,689. MOM 2005 Workgroup is priced at $499.

Pasted from <http://www.nwfusion.com/news/2004/0825msmom.html>

Seven roads to Data Center nirvana


Seven ways to cut Data Center operating costs and increase efficiency despite a growing workload.

Modern data centers equipped with the latest technological solutions, such as server virtualization, support for web applications, and autonomic computing, promise better use of hardware, software, and staff. As more and more companies invest in data centers, operational experience accumulates, and lower operating costs come with it. Analysts and users confirm that costs can be cut by anywhere from 25% to 90%. The road to such savings is the adoption of seven new practices:
1) Storage consolidation - gathering individual storage devices (disk array subsystems) into a storage hierarchy, most often using a SAN (storage-area network) or NAS (network-attached storage) architecture. The choice between the two depends on what the storage is for: NAS is best suited to storing files and archives, while SAN works well for database data. During operation, a storage manager should also be appointed to coordinate the allocation and use of this resource.
2) Server virtualization - unfortunately, every data center contains plenty of servers that are never fully utilized. This waste can be avoided by consolidating servers into clusters (in hardware, or with dedicated system software) and by introducing more efficient technologies such as blade servers and software virtualization (e.g., VMware, now a division of EMC). The result is much better hardware efficiency (less power for servers and air conditioning, less floor space, fewer UPS units) and cheaper operations (fewer operators). As above, the specifics of consolidation must be taken into account, e.g., that a single database engine manages many database instances, so stopping it halts all of those databases. The critical elements to address during consolidation are therefore resource and network management, change management, update management, and regression testing.
3) Data center consolidation - the cost of running separate data centers is driving interest in merging several of them into one large center. In practice this means moving hardware from many sites into a single center located outside urban areas, where greater security can be ensured. Such consolidation is, of course, possible thanks to the availability and reliability of broadband links.
4) Thin clients for outsourced applications - cutting the cost of supporting system and application software by using remote-computing technology from Citrix, or Web Services.
5) Using open source - once the hardware has been consolidated, it is time to consolidate the applications. A good cost-cutting strategy is to migrate these applications gradually, one by one (perhaps even reprogramming them), to the free Linux operating system. The more such applications there are, the greater the savings can be. There is just one caveat: it is true that a Linux license costs zero, but the other TCO components (retraining staff, support and system administration, software updates, and security) still have to be paid for.
6) Internet telephony - using VoIP technology can bring a significant reduction in the cost of telephone support for customers and users. It means handling phone calls over Internet links. Not only do the calls themselves become cheaper; the capacity of the phone channels and exchanges also grows. After consolidating data centers, the natural next step is consolidating the telephone exchanges into a single infrastructure based on Internet telephony, which at the same time reduces the cost of the staff who maintain the exchanges.
7) Autonomic computing - one of the key factors in cutting data center operating costs is automating the center's work, e.g., maintaining applications, monitoring the entire IT infrastructure, raising security alerts, and applying updates and security patches (tools from St. Bernard Software, and Microsoft's Windows Update Server). With the routine operation of existing applications automated, the same number of staff can support the rollout of new applications (e.g., using NetIQ's application manager tool), which yields large savings in human resources.
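The savings case behind point 2 is, at bottom, simple arithmetic: the combined load of many lightly used machines fits onto a handful of consolidated hosts. A back-of-envelope sketch (all utilization figures below are hypothetical, not from the article):

```python
import math

def hosts_needed(utilizations, target=0.60):
    """How many consolidated hosts can absorb a set of underutilized
    servers without pushing any host past a target average load."""
    return max(1, math.ceil(sum(utilizations) / target))

# Eight lightly loaded legacy servers (average CPU utilization as a fraction):
legacy = [0.08, 0.12, 0.05, 0.30, 0.15, 0.10, 0.22, 0.07]
print(hosts_needed(legacy))  # prints 2: eight boxes collapse onto two hosts
```

The same estimate is why the article stresses the non-hardware work: the two remaining hosts concentrate risk, so change management and regression testing matter more after consolidation, not less.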

HP shelves its antivirus project

HP puts choke hold on virus throttling product
HP has stopped developing its in-house antivirus product because of the obstacles it ran into implementing the solution in the Windows environment. The antivirus service project, announced in February, was meant to stop viruses and worms effectively in heterogeneous network environments by limiting the reach of DDoS attacks (the Virus Throttler component) and by scanning the network (Active Countermeasures). Both products performed superbly on HP's internal corporate network of 247,000 computers. They were to reach the market in 2005 and to mark a new, proactive (rather than defensive) approach to the virus problem. The problems with Windows sent both products back to the laboratory.

Wednesday, August 25, 2004

A survey by the University of Michigan: the American Customer Satisfaction Index (ACSI)

This year the ACSI index stands at 72.5 out of 100 points. Three categories were surveyed:

  • national cross-industry - 74.4
  • e-commerce (online travel, retail, auctions, betting) - 80.8
  • e-business (portals, search engines, and news) - 72.5

The e-business category breaks down as follows:

  1. portals - 71, i.e., Yahoo (78), MSN (75), and AOL (67)
  2. search engines - 80, i.e., Google (82), Ask Jeeves (71)
  3. news sites - 75, i.e., MSNBC, ABC News, and CNN at 74 each, all the others together at 75

Attention was drawn to the blurring of the boundaries between the three e-business components above: Yahoo, for example, is expanding into search, while Google is adding portal features.

The EU takes another look at the conduct of American firms on the European market

At issue are the renewed examination of the motion to punish MS for monopolistic practices involving the Windows operating system and Media Player, and the review of approval for the takeover of ContentGuard Holdings Inc., a firm specializing in DRM (digital rights management), by MS and Time Warner. The consequence of the latter would be monopolization of the digital rights protection market by those companies; the takeover of ContentGuard is expected to strengthen MS's dominant position. The DRM market is growing fast: according to IDC, by 2007 revenues could grow to $570 million, with 70% coming from the Windows environment.

The European Commission (EC) opened an in-depth investigation on Wednesday into plans by Microsoft Corp. and Time Warner Inc. to take over U.S. digital rights management (DRM) company ContentGuard Holdings Inc.

The transaction might create or strengthen a dominant position by Microsoft in the digital rights management market, the Commission said in a statement.

As part of the investigation the Commission will also investigate further competitive concerns related to the vertical integration of Microsoft in other markets, it said.

Microsoft had no comment on the decision early Wednesday. Time Warner, based in New York, was not immediately available to comment.

In April, Microsoft announced that it was increasing its investment in the Bethesda, Maryland, DRM firm, while Time Warner said that it was adding a new cash injection. Together the companies purchased substantially all the ownership previously held by ContentGuard's original technology provider Xerox Corp. The value of their investments has not been disclosed.

They sought clearance for the deal from the E.U. in July, when the Commission opened a first-phase investigation. The in-depth second phase investigation could take up to four months, the Commission said.

Both Microsoft and Time Warner are keen to invest in DRM, which is used to protect digital content against illegal uses. Concerns have been raised, however, that Microsoft and Time Warner could wield their combined power in the software and media markets, respectively, to dominate the DRM market.

The in-depth probe will evaluate whether the ContentGuard deal will put Microsoft's rivals in the DRM market at a disadvantage or whether the joint acquisition will slow development of open interoperability standards, the Commission said Wednesday.

Such outcomes could tip the DRM market toward the current leading provider, Microsoft, the Commission said.

The ContentGuard probe follows the E.U.'s scrutiny of Microsoft's dominance in the PC operating systems market. The Commission wrapped up its five-year investigation into the software maker earlier this year when it levied a fine of €497.2 million, or around $600 million, and ordered Microsoft to offer a version of its Windows operating system without the Windows Media Player software for abusing its dominance in the market. Microsoft has appealed the decision.

The worldwide DRM market is expected to grow quickly over the next few years, generating revenue of around $563 million by 2007, according to IDC. Furthermore, the researcher predicted that over 70 percent of DRM revenue in 2007 would be derived from the Windows operating environment.

Windows XP SE - what does Gartner think of it?

Gartner's opinion on Windows XP SE: crippling Windows' functionality is not a good idea, because MS designed for the first-time user rather than the first-time owner (many users already know MS software from Internet cafés).

According to Gartner, Microsoft has made the mistake of focusing on first-time users and not first-time owners. "Many citizens who do not own a PC are already familiar with basic PC use from cyber-cafes and schools," said Martin Gilliland, Principal Analyst at Gartner. "XPSE is likely to frustrate these users as it is not delivering the same quality experience due to the limitations imposed and the failure to allow the operating system to grow with users as they gain experience. We believe this will result in increased piracy as Microsoft has no upgrade path unless users pay full retail price for the Windows XP Home edition."
According to Gilliland, Microsoft has put significant effort into its XPSE having studied 1,000 first-time users in Thailand for almost a year following the launch of the Thai ICT PC programme last year. As a result, a number of new features that help first-time users have been added including a new support centre, tutorials on how to use the mouse and beginners' guides to using Windows and common applications. A number of features that are of little relevance to a first-time user - such as those that simplify overall use such as file and print sharing and local area network support - have also been removed from the operating system.
"While Microsoft should be commended for these efforts, they fall far short in other areas," said Wiggins. "The most significant is the deliberate crippling of the operating system to allow just three applications to run at any one time. Microsoft claims this provides a simpler end-user experience. But if a user were to run Yahoo! Instant Messenger, Microsoft Instant Messenger and an Email client they could not open a web browser or anything else for that matter." The cut-down version of the XP operating system also restricts the hardware the end-users can run. XPSE will not recognize more than 128MB of RAM or 40GB of HDD. On top of this the maximum video resolution is set at 800 x 600.
Wiggins added that although XPSE ships with XP SP2 installed, Microsoft has also failed to address security issues such as ongoing patch distribution on slow and expensive connections and anti-virus. He said the company had also failed to provide the user with any education in these areas.
"While Microsoft has made great improvements for the first-time user experience, it still fails to meet the most basic needs," said Gilliland.

Pasted from

Various items from the web

  1. Make it Acrobat 6.0. And work more securely. Adobe Acrobat offers flexible security options,
    proven reliability and is already a document standard with thousands of businesses and government agencies. Download a free tryout of Adobe Acrobat 6.0 today.
    http://nl.internet.com/ct.html?rtr=on&s=1,12sw,1,elb5,6n79,48gk,84du
  2. Interesting links: internet.com's network of more than 160 Web sites is organized into 12 channels:
    Developer http://internet.com/webdev/ Download http://internet.com/downloads/
    International http://internet.com/international/
    Internet Lists http://internet.com/lists/
    Internet News http://internet.com/news/
    Internet Resources http://internet.com/resources/
    IT http://internet.com/it/
    Small Business http://internet.com/sb/
    Linux/Open Source http://internet.com/linux/
    Windows Technology http://internet.com/win/
    Wireless Internet http://internet.com/wireless/
    xSP Resources http://internet.com/xsp/
  3. An interesting article on how Internet search engines (Yahoo and Google) rank the pages they find. The user is somewhat unnerved by the possibility that his preferences may be "discovered" as he searches the Internet: on the one hand he needs a mechanism that takes his preferences into account, but on the other he realizes that this information can be used against him. Here are the proposals of the leading search engines: http://www.searchnewz.com/searchnewz-12-20040823SearchEnginePersonalizationTheFallout.html
  4. On XP SP2: "... There are three methods offered by Microsoft to disable automated patching: an executable file (to run on each XP computer to change a registry setting); a group policy template (to apply to Active Directory); or a URL embedded in an e-mail message to each user..." Doesn't that third method look suspiciously like the delivery mechanism of e-mail viruses?
  5. Today's focus: Getting XP security updates without SP2 By Steve Blass
    --------------------------------------------------------------------
    Can we disable delivery of Windows XP Service Pack 2 through Automatic Updates and Windows Update without blocking the delivery of other critical security updates?
    Microsoft disabled XP SP2 updates for 120 days starting Aug. 16 because many companies wanted to test the service pack before it got automatically installed.
    Update control tools are available from Microsoft ( ). Each tool uses a different method to create a new registry key - "HKLM\Software\Policies\Microsoft\Windows\WindowsUpdate" with the value "DoNotAllowXPSP2."
    There is a template for companies that have implemented Active Directory-based Group Policy that centrally disables and enables delivery of SP2. This tool kit includes software that can run on individual PCs for companies that don't use Active Directory Group Policy.
    A sample script that accepts a machine name as a parameter is provided to support execution through logon scripts or remote script execution commands. A sample e-mail containing an update control URL is provided ( ) so users can disable and re-enable SP2 updates through a Web browser.
  6. NETWORK WORLD NEWSLETTER: DENNIS DROGSETH ON NETWORK/SYSTEMS MANAGEMENT 08/23/04
    Today's focus: Is quality of experience beyond SLAs?
    By Dennis Drogseth
    In this column I am going to press the point that quality of experience sets all traditional notions of service-level agreements on their heels. Now - not later - is the time to make the mental leap. QoE represents a fundamental shift in how SLAs can be defined.
    Taken at face value, QoE is exactly what it sounds like - the quality of experience. "Experience" is defined in my Oxford American Dictionary as "an actual observation of facts and events," and, as a verb, "to observe, to share, to actually be affected by - a feeling." What's interesting from these definitions is that the word itself combines two very different dimensions. One is a more empirical sense of observed reality, while the other includes sensation and imagination - it is about feeling.
    Both definitions play in QoE - which reflects a very different agenda than traditional SLAs. Rather than simply building from what's measurable up to the customer or end user, QoE would suggest starting with the end user, honoring the objective and subjective merits of his or her experience and trying to approximate them in metrics that can be validated in terms of technical performance and customer behavior.
    You already may be thinking that this approach is an unhealthy combination of masochism and naiveté, but I would argue just the opposite - it is the shortest path to comfort and mental health for you and your customers.
    Business productivity, customer loyalty, and business partnerships depend on QoE in all its dimensions. No one will stick with a provider that gets gold stars for SLAs but still leaves them experientially discontent - especially if other options present themselves. By trying to force you and your customers to live in a simulated universe in which only technical metrics apply, it is you who are being naïve. Sure, you will need to "manage" expectations and set some technical boundaries, but your ability to do this successfully is greatly enhanced once you approach the problem in terms of multi-dimensional experience rather than introverted technical specifications.
    A few pointers and observations:
    * Listen to your customers. While the old-fashioned help desk approach is often reactive and cumbersome, it can also provide useful background on customer perceptions and requirements. A strong, proactive service initiative will also help to promote dialog and interaction.
    * Recognize that while availability and performance remain prime factors, there are other dimensions to QoE - such as consistency, cost to the customer, security, flexibility (e.g., mobility of a service, or customer choice of service), and variety (number of available and customer-relevant services). This is not a finite list - because the dimensions of experience are not finite.
    * Look at options for testing responsiveness. Since degraded service has proven to be more of a customer turnoff than intermittent spurts of lack of availability, performance and QoE are probably the two most closely linked metrics. Until fairly recently, synthetic transaction analyses were the top choice for QoE validation, and they do still play a role. Synthetic transactions provide IT with a self-contained context for control. You can set the time and frequency and define SLAs accordingly - and of course synthetic transactions are superior for testing availability.
    New technologies - including slimmer, more efficient agent technology, more advanced server-based transaction analysis, and significant advances in techniques for packet analysis - are making observed transactional baselining more possible. Unlike synthetic transactions, observed baselining can inform you, on a dynamic basis, of actual customer behavior and customer disaffection - for example, when transactions are aborted due to impatience. Some techniques are now highly scalable in capturing individual user behaviors as well as infrastructure performance in large, geographically dispersed environments.
    These are just a few points. I invite your comments and opinions as well, and welcome intelligent disagreement and notes of support.
    Oh, and to answer the question posed in the headline: in my opinion, the glorious and troublesome fact is that QoE is indeed beyond SLAs, which can only, at best, approximate it - and that's because experience, itself, is more sprawling than the Internet, and more complex than all the data centers in the world.
    RELATED EDITORIAL LINKS
    End-user SLAs: Guaranteeing 'real' service levels
    Network World Outsourcing Newsletter, 07/30/03
    http://www.nwfusion.com/newsletters/asp/2003/0728out1.html
    Why quality of experience is important
    http://www.nwfusion.com/newsletters/nsm/2002/01502703.html
    IOS changes could alter face of Cisco routers
    Network World, 08/23/04
    http://www.nwfusion.com/news/2004/082304cisco.html?nl2
    CA looks to reduce 'integration tax'
  7. Software maker exposes hidden data
    Published: August 23, 2004, 9:30 AM PDT
    By David Becker
    Staff Writer, CNET News.com
    Workshare, a specialist in collaboration software built around Microsoft Office applications, is aiming to alert businesses to the danger of hidden data lurking in their documents.
    The company on Monday launched Metadatarisk.org, a Web site with information on the dangers posed by hidden metadata in documents. The site includes Metafind, a downloadable tool for automatically analyzing and exposing metadata in documents posted on a given Web site.
    "There's up to 25 different types of hidden metadata that exists in Microsoft documents," said Matthew Brown, Workshare product manager. "And the more documents get passed around, the bigger the risk becomes."
    Metadata, hidden information that can specify everything from a document's creator to deleted text, has become a growing risk for companies. British Prime Minister Tony Blair was embarrassed last year, when documents meant to bolster his cause for intervention in Iraq contained metadata with information that contradicted the official position. After examining metadata in a legal document of Linux adversary the SCO Group, CNET News.com learned that SCO originally planned to sue Bank of America.
    Word and other Office applications include tools for removing such metadata before a document is shared with others, but those capabilities are used inconsistently at best, Brown said. "It's something where it really needs to be part of company policy--how you deal with metadata," he said. "If you don't create and enforce a good policy about cleaning up after yourselves, there's a real risk."
    Workshare includes metadata removal tools in its namesake product, an application intended to enhance a company's ability to share and manage Office documents. The company also sells a separate product, Workshare Protect, which automatically strips metadata from documents before they leave a company's network.
    "Our vision is to encourage collaboration around Microsoft documents--but to do it securely," Brown said. "Collaboration is a very important part of today's working practices, but it does present some new risks."
  8. Jim Hugunin, the moving force behind IronPython and co-designer of AspectJ, is now a member of Redmond's Common Language Runtime team.

    Microsoft is continuing to grab top developer talent. The latest catch: Open source stalwart Jim Hugunin. Hugunin created Jpython/Jython; codesigned the AspectJ aspect-oriented-programming language while working at the Xerox PARC research center; and is the moving force behind IronPython, the implementation of the Python language targeted at .Net and Mono. Hugunin has joined Microsoft's Common Language Runtime team, where he will work on furthering Microsoft's support for dynamic languages. (Dynamic programming languages enable programs to change their structure as they run.)
    Hugunin started with Microsoft on August 2. But he hasn't completely abandoned the open source fold.
    A posting on the Iron Python home page said Hugunin plans to continue to work on Iron Python from inside Microsoft. The first public version of IronPython was released on July 28 under the Common Public License, an open-source license http://www.eweek.com/article2/0,1759,1495495,00.asp
    "JimHugunin (sic) has announced that he is going to join the CLR team at Microsoft, to continue his work on IronPython, and further improve the CLR's support for dynamic languages," reads the posting on the Iron Python site.
    Since joining Microsoft, Hugunin has launched a blog on Microsoft's Microsoft Developer Network (MSDN) site.
    "Over the past year, I've become a reluctant convert to the CLR," he said in his first post. "My initial plan was to do a little work and then write a short pithy article called, 'Why .NET is a terrible platform for dynamic languages.' My plans changed when I found the CLR to be an excellent target for the highly dynamic Python language."
    Check Out Hugunin's Site New Microsoft Blog = http://blogs.msdn.com/hugunin/
    While many Microsoft staffers posting to their own Web logs seemed most interested in Hugunin's Python roots, his Java-savvy also could be of use to Microsoft.
    AspectJ is an aspect-oriented extension to the Java programming language that is currently overseen by the Eclipse.org standards body. And, as Hugunin noted on his personal Web site, "Jython is frequently cited as compelling evidence that the JVM (Java Virtual Machine) is an effective platform for languages other than Java when making comparisons to Microsoft's CLR."
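
The registry change described in item 5 can also be expressed directly as a .reg file. This is a sketch based on the key and value named in that article; the DWORD data of 1 is an assumption (Microsoft's tools set the value, but the article does not give its data):

```reg
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\Software\Policies\Microsoft\Windows\WindowsUpdate]
"DoNotAllowXPSP2"=dword:00000001
```

Deleting the value (or setting it back) re-enables SP2 delivery through Automatic Updates.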
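
Item 7's point about hidden document metadata is easy to demonstrate programmatically. The sketch below targets the newer zip-based Office format (OOXML, e.g. .docx), where core properties such as the author live in a docProps/core.xml part; the 2004-era binary .doc files discussed in the article keep similar fields in OLE property streams instead, so this illustrates the risk rather than Workshare's actual tooling.

```python
import zipfile
import xml.etree.ElementTree as ET

def document_metadata(docfile):
    """Return the metadata fields (creator, title, lastModifiedBy, ...)
    stored in the docProps/core.xml part of a zip-based Office document.

    `docfile` may be a filename or a file-like object."""
    with zipfile.ZipFile(docfile) as z:
        root = ET.fromstring(z.read("docProps/core.xml"))
    # Element tags look like "{namespace-uri}creator"; keep the local name.
    return {child.tag.split("}")[1]: (child.text or "") for child in root}
```

Running something like this over every document published on a Web site is essentially what the Metafind tool described above automates.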
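
The parenthetical in item 8 about dynamic languages deserves a concrete illustration. In Python (the language IronPython implements), a program really can change its own structure while running, which is exactly the capability the CLR needed to support well; the class names here are invented for the example:

```python
class Greeter:
    """Starts out empty; gains behaviour at runtime."""
    pass

def hello(self, name):
    return "Hello, " + name

# Attach a brand-new method to an existing class at runtime...
Greeter.hello = hello

# ...and build a whole new subclass on the fly from a name, a tuple
# of bases and a dict of attributes.
Shouter = type("Shouter", (Greeter,), {
    "hello": lambda self, name: hello(self, name).upper(),
})
```

A static language fixes class layout at compile time; supporting this kind of runtime restructuring efficiently is what made the CLR an interesting (and initially doubtful) target for Python.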

Tuesday, August 24, 2004

A new notion of user experience quality

Today's focus: Is quality of experience beyond SLAs?

By Dennis Drogseth


A trial with OneNote

Friday 5 March 2004
CA disputes Linux licence claim by SCO
Computer Associates has criticised SCO for misrepresenting the terms of a software licensing arrangement between the two companies that protected CA from a potential SCO lawsuit.
SCO chief financial officer Bob Bench confirmed that CA was one of four publicly named companies to sign up for SCO's Intellectual Property Licence for Linux, a $699 licence which, SCO said, Linux users must purchase in order to avoid violating SCO's copyrights.
However, a CA executive said that his company had purchased no such licence, but had instead acquired a large number of licences for SCO's UnixWare operating system as part of a $40m breach of contract lawsuit settlement in August 2003 with SCO investor The Canopy Group.
Around the time of the settlement, SCO announced that it had signed up the first customer for its Linux licence. Although SCO did not reveal the identity of this customer, industry speculation centred on it being CA.
By acquiring the UnixWare licences, CA indemnified itself against a possible Linux lawsuit from SCO, said Sam Greenblatt, the senior vice-president and chief architect of CA's Linux Technology Group.
"We did an agreement with the Canopy Group and in the agreement with the Canopy Group, we acquired UnixWare licences," he said. "For every UnixWare licence you acquired, you got indemnified for that number of Linux licences."
SCO spokesman Blake Stowell disagreed with Greenblatt's characterisation, saying that CA had, indeed, obtained an IP Licence for Linux.
“UnixWare licences allow SCO customers to run UnixWare, and the SCO Intellectual Property Licence allows Linux end users to run our Unix intellectual property in binary form in Linux. Today, CA has a licence in place to run our Unix IP in binary form in Linux without fear that they may be infringing on our intellectual property,” he said.
Greenblatt strongly objected to the portrayal of CA as an IP Licence for Linux customer.
"To represent us as having supported the SCO thing is totally wrong," he said.
Greenblatt had harsh words for SCO and the company's chief executive officer, Darl McBride, whose tactics were "intended to intimidate and threaten customers".
"We totally disagree with his approach, his tactics and the way he's going about this."
Separately, another company mentioned as a SCO Linux licensee denied knowledge of any such agreement.
Although SCO's Bench had confirmed Carthage Missouri's Leggett & Platt as a licensee, a spokesman for the manufacturing company said that he had no knowledge of such a deal.
"I have now talked to our people who handle our Linux systems and, at least at a corporate level, we have not bought such a licence from SCO Group," said John Hale, the company's vice-president of human resources.
"It's conceivable - we're a large, far-flung corporation - that some unit of Leggett & Platt in some part of the country may have been persuaded to buy such a licence, but if they did we are not aware of it," Hale said.
One financial analyst said that the conditions surrounding the CA licence did not cast a favourable light on SCO, which has claimed that Linux illegally contains some of its Unix code.
"I think it just speaks to the weakness of their case. Why could [CA] have not been convinced to take a licence without legal action?" said Dion Cornett, a managing director with Decatur Jones Equity Partners.
The other two companies named as IP Licence for Linux customers are Houston-based EV1Servers.net and Questar. Both have confirmed that they did purchase SCO's licence.

Monday, August 23, 2004

From the ZDNET blog

Recently ZDNET's journalists came to the defense of MS's position on the release of SP2, arguing that it is absolutely necessary and should be installed despite reports of "holes" discovered in it. They argue that users have excessive expectations about being kept safe while browsing the Internet. After all, the breakdown of factors in a PC break-in is roughly: 10% system security flaws, 40% social engineering and 50% user stupidity. Major weaknesses of the IE browser are (a) its full, tight integration with the operating system and (b) its use of COM and ActiveX technologies.

Sunday, August 22, 2004

Sunday

An article from www.business2.com
Where do companies selling open-source products get their money? All software companies sell not the programs themselves but licenses to use them (at a profit margin of nearly 100%).
But what's the value of a license to a customer? A license doesn't deliver the code, provide the utilities to get a piece of software running, fix bugs, answer the phone when something inevitably goes wrong, or tweak the software to meet a special need. The value of software, in short, doesn't lie in the software alone. The value is in making sure the software does its job. Just as a traveler should look at the overall price of a vacation package instead of obsessing over the price of the plane ticket or hotel room, a smart tech buyer won't focus on how much the license costs and ignore the support contract or the maintenance agreement. He'll look at the total package and decide if it's worth it.


Open-source is not that different. If you want the software to work, you have to pay to ensure it will work. The open-source companies have refined the software model by selling subscriptions. They roll together support and maintenance and charge an annual fee, which is a healthy model, though not quite as wonderful as Microsoft's money-raking one.


Tellingly, even Microsoft is casting an envious eye at aspects of the open-source business model. The company has been taking halting steps toward a similar subscription scheme for its software sales. Under a software subscription, how much will you be paying for the license part of the package? You won't know or even care. It might as well be zero, because the value of software is in the results it delivers, not how you get your hands on it.


That may explain why Microsoft's subscription program, known as Software Assurance, is struggling. Software Assurance provides maintenance and support together with a software license. It lets you upgrade to Microsoft's next version of the software for a predictable sum. But it also contains an implicit threat: If you don't switch to Software Assurance now, who knows how much Microsoft will charge you when you decide to upgrade?


Chief information officers hate this kind of "assurance," since they're often perfectly happy running older versions of software that are proven and stable. Microsoft, on the other hand, rakes in the software-licensing fees only when customers upgrade. Software Assurance is Microsoft's attempt to get those same licensing fees but wrap them together with the service and support needed to keep systems running. That's why Redmond finds the open-source model so threatening: Open-source companies have no vested interest in getting more licensing fees and don't have to pad their service contracts with that extra cost. In the end, the main difference between open-source and proprietary software companies may be the size of the check you have to write.